Digital Transformation Keeps Software Complexity from Becoming a CIO’s Legacy

by Jonathan Bloom

They say, “if something works, don’t fix it.” This old adage may explain why some organizations hold onto legacy systems longer than they should, but it is also why those same organizations struggle with software complexity. In fact, according to the GAO, Uncle Sam spends 80 percent of its $86.4 billion IT budget on legacy systems.

Some organizations choose not to struggle, though. Take, for example, the story of Pennsylvania-based underwriter NSM Insurance Group. SearchCIO.com recently reported that NSM last year purchased a company still running a COBOL-based back-office system from the 1990s.

As the article’s author Mary Platt explains, “The legacy system did the work the acquired company needed, but it required a niche firm to maintain it at a significant cost and, moving forward, it couldn't handle NSM's business requirements.”

Fortunately, NSM CIO Brendan O’Malley wasn’t nostalgic about the COBOL-developed system. He notes that the decision to replace it was clear-cut.

Many CIOs scratch their heads in bemusement, wondering why their predecessors never bothered to upgrade these systems before they had to be scrapped, as O’Malley’s was. Others, however, throw up their hands, declare there is no way to upgrade legacy software applications, and forge ahead trying to make the best of what exists.

This poses a significant problem, though. The average IT manager, and most CIOs, were in high school when COBOL systems were implemented. It is therefore highly unlikely that even the most senior members of the IT department will have had experience with the code used to write the legacy applications, which further adds to the system’s complexity.

And the issues go beyond rewriting old code or untangling the system to transfer data. Equally complex, if not nearly impossible, is figuring out where the old mistakes lie: if nobody knows what’s right, how can anyone know what’s wrong? This makes fixing old issues problematic at best. The most frequent answers are workarounds, or simply ignoring the issues and hoping they won’t pose a problem down the road. But sidestepping the problem is akin to failing to interview eyewitnesses during a crime investigation. It is the kind of shortcut that produces the poor structural quality behind future failures, or even crimes (i.e., hacking through unforeseen security vulnerabilities).

A CIO has two choices: conduct automated code analysis of the legacy application to assess its software complexity and system vulnerabilities before integrating new applications on top of it, or transform as NSM did.
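Automated code analysis need not be mysterious; at its simplest, it is systematic pattern counting over source files. As a toy illustration only (this is not how CAST AIP or any commercial analyzer works), the Python sketch below estimates the complexity of a COBOL-style fragment by counting decision keywords; the keyword list and scoring rule are assumptions chosen for the example.

```python
import re

# Toy complexity scan for COBOL-style source. The keyword list is an
# illustrative assumption; real analyzers parse the language properly.
# The negative lookbehind skips scope terminators such as END-IF and
# END-EVALUATE so they are not double-counted as decisions.
DECISION_KEYWORDS = re.compile(
    r"(?<!-)\b(IF|ELSE|EVALUATE|WHEN|PERFORM\s+UNTIL)\b",
    re.IGNORECASE,
)

def complexity_score(source: str) -> int:
    """Return 1 + the number of decision points: a crude, cyclomatic-style estimate."""
    return 1 + len(DECISION_KEYWORDS.findall(source))
```

Running such a scorer over every program in a legacy portfolio and ranking the results would give a first, rough map of where the riskiest code lives, before any deeper system-level analysis.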

Reducing software complexity is one of the technical goals of a digital transformation, like the one O’Malley conducted at NSM or the one at the American Cancer Society discussed in a February OnQuality blog, that enables the business goals to be realized. In today’s software-driven business world, digital transformation has become an enormous component of business transformation and software risk management. But transformation programs pose their own challenges.

Regardless of whether a CIO opts to stick with the organization’s legacy applications or undergo digital transformation, he or she would be wise to employ solutions that provide visibility into obstacles such as excessive complexity and architectural vulnerabilities. All such software risks can be identified through system-level and architectural analysis solutions such as CAST AIP.

Jonathan Bloom, Technology Writer & Consultant
Jonathan Bloom has been a technology writer and consultant for over 20 years. During his career, Jon has written thousands of journal and magazine articles, blogs and other materials addressing various topics within the IT sector, including software development, enterprise software, mobile, database, security, BI, SaaS/cloud, Health Care IT and Sustainable Technology.