Today's Modern Applications Will Be the Legacy Applications of Tomorrow

by Jerome Chiampi

Application modernization has always been a priority in the IT world. Whatever the reason, applications must be regularly adapted to the next modern environment to prevent business disruption and to enable new, disruptive ways of working. Of course, we saw this happen nearly 20 years ago with Y2K. In more recent years, however, mainframe replacement and cost-reduction goals have driven most of the transformation narrative. In a nutshell, we are talking about legacy modernization.

Legacy Modernization Tooling Must Enable System-Wide Transparency

I have contributed to several system migration projects, and each time we faced a similar scenario: tooling was deployed by a dedicated team and set up with a project-only perspective. The goals were multiple. It was necessary to analyze the system or application to identify the project boundary, to organize the project by creating batches, and to identify software risk across that perimeter. Creating batches requires teams to know how components are linked together and which components are needed to build correct test plans. Identifying the related software risk means knowing which components are not compatible with the target environment, which ones have poor quality, and which are classified as dangerous.
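To make the batching idea concrete, here is a minimal sketch of how a team might group components into migration batches from a dependency list produced by an analysis tool. The component names, dependencies and the connected-components approach are invented for illustration; this is not how any particular product works.

    from collections import defaultdict

    # Hypothetical (caller, callee) pairs extracted by an analysis tool;
    # illustrative data only.
    dependencies = [
        ("BillingUI", "BillingService"),
        ("BillingService", "CustomerDB"),
        ("ReportJob", "CustomerDB"),
        ("PayrollUI", "PayrollService"),
    ]

    # Build an undirected adjacency map so that components that touch each
    # other end up in the same migration batch.
    graph = defaultdict(set)
    for caller, callee in dependencies:
        graph[caller].add(callee)
        graph[callee].add(caller)

    def batches(graph):
        """Group components into batches (connected sub-graphs) that can be
        migrated and tested together."""
        seen, result = set(), []
        for start in graph:
            if start in seen:
                continue
            batch, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node in seen:
                    continue
                seen.add(node)
                batch.add(node)
                stack.extend(graph[node] - seen)
            result.append(sorted(batch))
        return result

    print(batches(graph))
    # e.g. [['BillingService', 'BillingUI', 'CustomerDB', 'ReportJob'],
    #       ['PayrollService', 'PayrollUI']]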

Most of the time, this tooling was very effective, but at the end of the project it was often abandoned because the team in charge moved on and the benefit for day-to-day work was not always well perceived. This is a shame, because teams can and should continue to use these tools throughout the application lifecycle and on other initiatives beyond legacy application modernization.

Why Legacy Modernization Tools Should Be Used Across Application Portfolios

Tooling used in legacy modernization projects can also benefit the day-to-day work of development teams. In maintenance, it is important to evaluate the impact of an incident on the application and answer questions like: “Which component failed?” and “Where can the problem propagate?” Teams in charge of maintenance must be able to understand how the system is organized and whether there are components considered at risk that must be fixed or replaced.
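As a rough illustration of that kind of impact analysis, the sketch below reverses a hypothetical "depends on" list to find every component a failure could propagate to. The components and the simple reachability traversal are assumptions made for the example, a simplified stand-in for what a real analysis tool computes.

    from collections import defaultdict

    # Hypothetical (component, dependency) pairs; illustrative data only.
    depends_on = [
        ("BillingUI", "BillingService"),
        ("BillingService", "CustomerDB"),
        ("ReportJob", "CustomerDB"),
    ]

    # Reverse the edges: for each component, which components rely on it?
    dependents = defaultdict(set)
    for component, dependency in depends_on:
        dependents[dependency].add(component)

    def impact(failed):
        """Return every component reachable upstream from the failed one,
        i.e. everywhere the problem can propagate."""
        affected, stack = set(), [failed]
        while stack:
            node = stack.pop()
            for user in dependents.get(node, ()):
                if user not in affected:
                    affected.add(user)
                    stack.append(user)
        return affected

    print(impact("CustomerDB"))
    # {'BillingService', 'BillingUI', 'ReportJob'}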

The same modernization tools are useful for new development as well. Developers must know how the code they write will integrate with the existing software. It is also important for them to be sure they do not introduce new software risk into the applications.

I recently met with a company to explain how application analysis tools provide value, and they voiced two primary concerns:

  1. Their applications were mainly implemented in COBOL and .NET, and only a few people knew how they were structured. Existing documentation was outdated, and it was very challenging for new team members to ramp up quickly.
  2. There was quite a bit of suspicious code across the application portfolio. They wanted to update and modernize these codebases, but they lacked an understanding of how the code changes would impact the overall application. They wanted to use modernization tools but were unable to estimate the cost or time to modernize. Not to mention, they didn’t want code changes to break the app.


Software Intelligence Aids Legacy Modernization Efforts

Software Intelligence helps teams perform analysis at both the system and application level. It’s important to look for a solution that provides an overview of the application and identifies software risk from a global point of view. Once the “heat map” of risky components is identified, teams can drill down into specific violations and manage action plans for the remediation of critical issues.
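As a simplified illustration of the heat-map-then-drill-down idea, the following sketch aggregates hypothetical violation records into a per-component risk score and then lists the violations of the riskiest component. The rules, severities and scoring formula are invented for the example, not taken from any specific tool.

    # Hypothetical violation records, as an analysis tool might export them;
    # rules and severities are illustrative only.
    violations = [
        {"component": "BillingService", "rule": "SQL injection risk", "severity": 9},
        {"component": "BillingService", "rule": "Unbounded loop", "severity": 6},
        {"component": "ReportJob", "rule": "Dead code", "severity": 2},
        {"component": "PayrollService", "rule": "Hard-coded credential", "severity": 10},
    ]

    # Aggregate a simple risk score per component to build the "heat map".
    scores = {}
    for v in violations:
        scores[v["component"]] = scores.get(v["component"], 0) + v["severity"]

    heat_map = sorted(scores.items(), key=lambda item: item[1], reverse=True)
    print(heat_map)
    # [('BillingService', 15), ('PayrollService', 10), ('ReportJob', 2)]

    # Drill down into the riskiest component to plan remediation actions.
    worst = heat_map[0][0]
    for v in violations:
        if v["component"] == worst:
            print(worst, "->", v["rule"], "(severity", str(v["severity"]) + ")")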

Another benefit of Software Intelligence is that it sheds light on how systems and applications are structured and how their components are connected. These capabilities are valuable for maintenance, new development and legacy modernization efforts. Not to mention, this insight will help teams prolong the life of modern applications and slow their evolution into the legacy applications of tomorrow.

Filed in: Technical Debt
Jerome Chiampi, Product Owner
Jerome Chiampi is a Product Owner at CAST and is responsible for helping clients leverage Software Intelligence products to reduce software risk. He has 20 years of experience working in the software industry and is a trained software development engineer with expertise in assessing software and application security.