Cybersecurity is a hot-button issue these days. You can barely go a few weeks without hearing about a company suffering a breach that puts the business at risk. With all eyes focused on making software more secure, a happy side effect might just be a streamlining of software modernization initiatives.
It might not be as sexy as AppSec, but modernization efforts are crucial to the future of business. Speaking at the Cyber Resilience Summit, hosted by the Consortium for IT Software Quality (CISQ) last month as part of CyberWeek in Washington, D.C., former U.S. CIO Tony Scott said that a soon-to-be-released report covering a plethora of issues surrounding the modernization of the government’s existing IT framework fails to address one key issue – modernizing aging legacy systems.
“I think it’s a crisis that’s bigger than Y2K. It’s just creeping up on us slowly, month by month, year by year,” said Scott. “But there is a point in the future where there’s just not going to be the knowledgeable resources to keep the old stuff going on the one hand, and then not enough resources to migrate off of those old things on the other hand. It’s something that I think is a problem now and we really need to move aggressively to get it done.”
In the IT realm, new technologies routinely outgrow existing software, but businesses continue to rely on older software because it works. Why fix what isn’t broken, right? Or why completely replace a system that can be fixed with a software patch?
Unfortunately, there comes a critical point in an application’s lifecycle when it has been fixed, repaired and touched up so many times over the years that it becomes a piecemeal system, riddled with complexity and composed of faulty components and countless languages. By the time an application reaches this state, it’s common that nobody is left in the IT department who knows how to keep it running!
A legacy system’s maintainability is a direct result of its transferability – how easily someone newly assigned to the application can become productive – and its changeability – how easily and quickly the application can be modified. The better software performs in these areas, the easier it will be to maintain once it reaches legacy status.
This raises the question, “How do we know if the software is transferable and changeable?”
One proposal is to form “a Cyber National Guard that brings private sector technology professionals to the federal government on a temporary but continuous basis to provide expertise on securing and innovating networks.” Another is to use automated code review.
When employed during the development stage, automated code review tracks the application as it’s built and determines whether the code is too complex, whether it lacks security characteristics, and whether it is written in a manner that allows it to be easily transferred or changed. After an application becomes a legacy system, automated assessments help teams gauge architectural soundness, perform impact analysis and model changes to the application to better understand potentially dangerous data access paths that offer hackers a backdoor.
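To make the idea concrete, here is a minimal sketch of one slice of automated code review – flagging overly complex functions as changeability risks. It uses Python’s standard-library `ast` module as a stand-in for a commercial analyzer; the `flag_complex_functions` helper and the complexity threshold are illustrative assumptions, not any particular vendor’s method.

```python
import ast

def cyclomatic_complexity(func_node: ast.FunctionDef) -> int:
    """Crude cyclomatic-complexity proxy: count decision points in a function."""
    complexity = 1  # one linear path through the function to begin with
    for node in ast.walk(func_node):
        # Each branch, loop, exception handler or boolean operator adds a path.
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)):
            complexity += 1
    return complexity

def flag_complex_functions(source: str, threshold: int = 10) -> list[str]:
    """Return names of functions whose complexity exceeds the threshold."""
    tree = ast.parse(source)
    return [
        node.name
        for node in ast.walk(tree)
        if isinstance(node, ast.FunctionDef)
        and cyclomatic_complexity(node) > threshold
    ]
```

Run against a codebase on every commit, a check like this catches complexity creep while the original developers are still around – exactly the window the article argues matters most.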
Much of this analysis is already being done to strengthen application security. If companies can multi-task and put this kind of system-level analysis to good use for modernization efforts, we will see stronger, more efficient and more robust software to power the future of business.
Erik Oltmans, an Associate Partner at EY Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible despite limited access to the target’s systems, with customized quality metrics and analysis of the liability implications of open source components – all three of which are critical for M&A due diligence.