Earlier this year, Cloudflare’s outage left millions of customers unable to access websites, and Boeing’s unaddressed software flaw contributed to two tragic crashes, leading the company to report its largest-ever quarterly loss. These are just two of a growing number of software quality failures dominating the news agenda and crippling companies globally.
Software is the heart of every organization, beating in the background to keep the business up and running. As with the human heart, we expect software to perform flawlessly around the clock; and when it goes wrong, it goes wrong spectacularly.
Ignoring software quality and skipping regular health checks allows problems to stack up, often resulting in severe repercussions and widening the security gaps that experienced hackers are quick to exploit.
CAST recently completed its fourth annual software intelligence report, investigating what causes software failures and how to prevent them. The report is the largest of its kind, analyzing over 700 million lines of code across 14 different technologies.
The report found that service-critical applications used by organizations averaged 12 years old, two years older than the average application analyzed. The majority were still anchored by legacy COBOL systems that have been left without active development because they are written in code most developers no longer understand.
While there is a strong focus on application modernization and new application innovation, IT leaders are prioritizing low-impact applications: they no longer understand their maturing apps, and they fear the repercussions that modernization could inflict on them. This leaves older, high-impact software, such as business-critical enterprise systems, amassing complexity and technical debt.
The issue is compounded by the fact that nearly half (46%) of business revenue-related systems serve both internal and external customers. Without active development or modernization, these systems run a greater risk of lapsing into failure, with widespread effects.
Only 25% of business-critical apps scored higher than 87 (out of 100) on application resiliency, which measures an application's ability to recover from certain types of failure while still providing an acceptable level of service to the business. Low application resiliency often results in extended downtime, frustrating users and driving up running costs.
The report confirms that development teams are significantly understaffed, by 65% when compared with the Constructive Cost Model (COCOMO II), a trusted method for estimating the effort, cost and schedule of software projects. Development teams currently average 5.5 full-time equivalents (FTEs), whereas COCOMO II recommends 16 FTEs be allocated for maintenance activities. The mounting pressure on these short-staffed teams to develop new functionality while maintaining increasingly complex systems leads to high employee turnover and outsourcing. High turnover disrupts the continuity of system maintenance, often leaving behind complex code that is hard to decipher, snowballing technical debt.
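To make the COCOMO II comparison concrete, the sketch below computes effort with the published post-architecture formula (COCOMO II.2000 calibration: A = 2.94, B = 0.91) and the staffing gap the report cites. The 100 KSLOC system size and nominal ratings are hypothetical inputs for illustration; the report does not publish the parameters behind its figures.

```python
# Illustrative sketch of the COCOMO II post-architecture effort model.
# Calibration constants from COCOMO II.2000.
A, B = 2.94, 0.91

def cocomo_effort(ksloc, scale_factors, effort_multipliers):
    """Return estimated effort in person-months.

    ksloc: system size in thousands of source lines of code
    scale_factors: the five scale-factor ratings (PREC..PMAT)
    effort_multipliers: the seventeen cost-driver multipliers
    """
    exponent = B + 0.01 * sum(scale_factors)
    em_product = 1.0
    for em in effort_multipliers:
        em_product *= em
    return A * ksloc ** exponent * em_product

# Nominal ratings: the five scale factors sum to ~18.97,
# and all seventeen cost drivers are 1.0.
nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]
effort_pm = cocomo_effort(100, nominal_sf, [1.0] * 17)

# The staffing gap cited by the report: 5.5 FTEs on hand vs 16 recommended.
gap = (16 - 5.5) / 16
print(f"Effort for a nominal 100 KSLOC system: {effort_pm:.0f} person-months")
print(f"Understaffing relative to the recommendation: {gap:.1%}")
```

The gap works out to roughly 66%, consistent with the 65% understaffing figure in the report once rounding is accounted for.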
In a constantly advancing digital age, accelerated by the rapid rise of IoT devices, companies will struggle to keep pace in managing and maintaining their systems without deeper insight into them.
Software Intelligence, a deep analysis of software architecture, provides unprecedented visibility into the inner workings of all systems, legacy or new. It highlights software quality and security risks, code complexity, technical debt and several other software health metrics, giving companies a better understanding of the software problems they face today and in the future.
With this information, architects can make informed decisions, easing the business pressures that force teams to take shortcuts, and ultimately reducing cost as well as risk.
If businesses want to protect their reputation and build customer satisfaction, they must recognize the importance of their technology and the people who maintain it. Without significant investment in and focus on these two key factors, organizations will cave in on themselves, left in a dilapidated state with no one to pick up the pieces.
Erik Oltmans, an Associate Partner at EY Netherlands, spoke at the Software Intelligence Forum about how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik described the changing landscape of M&A: besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He went on to detail how CAST Highlight makes these assessments possible despite limited access to the target's systems, offering customized quality metrics and insight into the liability implications of open source components, all three of which are critical for M&A due diligence.