It’s not uncommon for organizations to hold onto their application software and IT systems longer than they should. This is particularly true for government agencies – Federal, state and local. When you combine an “if it ain’t broke, don’t fix it” mentality with budget cuts and comfort levels of staffers, there is little impetus for change.
Clifford Gronauer, CIO of the Missouri State Highway Patrol, discovered just such a system last year. Gronauer was charged with upgrading the patrol’s aging IT system. Upon vetting the scope of the project, he found an antiquated system of mainframe-based legacy applications that dated back to the 1970s!
The project turned into what Gronauer termed a “perfect storm” of upgrades that forced him to alter his plans from upgrading the system piece-by-piece to doing a complete overhaul broken into larger phases. On the bright side, he stumbled upon a Federal grant that would pay for the project and, in the end, the task earned him recognition as a finalist for the 2011 MIT Sloan CIO Symposium Award for Innovation Leadership.
I can only imagine Gronauer’s reaction when he realized the enormity of the fix that was going to need to happen. It must have been something akin to the one Roy Scheider’s character had in the original “Jaws” film when he first laid eyes upon the monstrous great white – “You’re gonna need a bigger boat.”
Digging for Clues
Dealing with legacy applications is never fun; in fact, it probably leaves many CIOs scratching their heads and wondering why their predecessors never bothered to upgrade the system. Since there’s no way to retroactively upgrade the application software, they have no choice but to move ahead and make the best of what exists.
This poses a significant problem, though. The average IT manager and most CIOs out there are around my age, and I was in grade school when the Missouri State Patrol’s old system was implemented. This means it’s highly unlikely that even the most senior members of the IT department will have had experience with the code used to write the legacy apps.
The problem this unfamiliarity poses goes beyond just trying to rewrite old code or untangle the system in order to transfer data. Equally complex, if not nearly impossible, is figuring out where the old mistakes lie – if nobody knows what’s right, how would they know what’s wrong? This makes finding fixes for old issues problematic at best. Workarounds, or simply ignoring the issues in the hope they won’t pose a problem down the road, are the most frequent answers, but sidestepping the problem is akin to failing to interview eyewitnesses during a crime investigation. It’s the kind of inaction that leads to poor structural quality, which in turn leads to future failures or even crimes being committed (i.e., hacking due to unforeseen security vulnerabilities).
Since dumb luck is no way to establish a foundation for a new or upgraded IT system, a company building up from a system of legacy apps needs to fully analyze what it has and then continually assess the build as it progresses.
Manual analysis of any application software build is cumbersome, time-consuming and highly inefficient – like finding a single needle in 4,000 haystacks. Multiply that difficulty by the fact that the person doing the manual analysis of the legacy app probably doesn’t even know what the needle looks like, and the chances of finding the culprit become vanishingly small.
On the other hand, an automated assessment platform can conduct an investigation of hundreds of thousands of lines of code more quickly and with a far better understanding of what it is looking for. By automating the process of static analysis, companies can ferret out offending legacy code and give those responsible for the upgrade a solid structure upon which to build. And employing this same platform of automated analysis and measurement to conduct continual architectural and code component reviews to find any new issues that arise ensures that what is being built atop the legacy application interacts properly with the existing code.
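To make the idea concrete, here is a minimal sketch of what one automated static-analysis rule looks like under the hood. This is a hypothetical toy example (not the actual CAST platform or any vendor’s rule set): it walks a Python syntax tree and flags calls to eval() and exec(), a classic source of injection vulnerabilities in old code.

```python
# Toy static-analysis rule (hypothetical example, not a vendor tool):
# walk the abstract syntax tree of a source file and flag risky calls.
import ast

RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str):
    """Return a (line_number, call_name) pair for each risky call found."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            findings.append((node.lineno, node.func.id))
    return findings

# A made-up fragment standing in for legacy code under review.
legacy_snippet = """
def run_report(expr):
    return eval(expr)  # user-supplied input fed straight to eval
"""

print(find_risky_calls(legacy_snippet))  # [(3, 'eval')]
```

A real platform applies thousands of rules like this across languages and architectural layers, which is exactly why the automated approach scales where a manual line-by-line review cannot.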
This level of attention to structural quality is crucial in the constant fight of the IT department to eliminate outages and security vulnerabilities. So crucial in fact, that failing to conduct automated assessment when building on top of a legacy application should certainly be considered a crime.
Erik Oltmans, an Associate Partner at EY Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible despite limited access to the target’s systems, delivers customized quality metrics, and surfaces the liability implications of open source components – all three of which are critical for M&A due diligence.