As Larry Quinlan, Global CIO, Deloitte Touche Tohmatsu Limited explains, “CIOs need the courage to make the investments that reduce technical debt -- and the knowledge and the team to know where and when to make those investments.”
Yet, despite advances that give IT management visibility into the cost and quality of application development, one issue remains unresolved: accurate technical debt estimation. The problem lies in how technical debt is calculated and communicated to management. In most cases, only the changes to the code itself are factored in, while the validation effort is left out. So when management asks for the finished product, IT keeps telling them they need more time to validate the code changes so they don't break any existing features. This frustrates management, who rely on estimates to plan future projects, manage resources, and set budgets.
Past estimates also failed to account for the size of an application, its interdependencies, and all the technologies connected to it. Fixing a local violation in a single line of code is easy. But the effort required to correct an issue grows by orders of magnitude when it involves multiple teams and technologies.
With so many variables, how can management ever get accurate estimates to address and reduce technical debt? How can they close the gap between development teams' estimates and the final cost of delivering defect corrections? The answer is to split and sum: split the effort into categories -- code remediation, unit testing, integration testing, and complete validation overhead -- and sum them to get the total.
If development teams have structural quality tools that can be configured to model the cost of remediation and validation more accurately, they can generate realistic estimates of technical debt.
This is done by associating a technical debt effort with each vulnerability identified, so each time a quality check runs, management receives a realistic estimate of the time to remediate and validate the code based on past performance. For example, to correct a complex system-level defect, the code remediation effort might be 60 minutes and unit testing another 60 minutes, but integration testing and complete validation would more likely take hours, even days. Now management has a reliable, repeatable roadmap from which to make IT investment decisions.
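The split-and-sum approach above can be sketched in a few lines of code. This is a minimal illustration, not any vendor's actual model: the category names and per-violation effort figures are assumptions loosely based on the example in the text (60 minutes each for remediation and unit testing on a complex defect, with integration testing and complete validation assumed to take hours).

```python
# Hypothetical effort model: minutes per violation, split into the four
# categories named in the text. All figures are illustrative assumptions.
EFFORT_MODEL = {
    "complex_system_defect": {
        "code_remediation": 60,       # from the example in the text
        "unit_testing": 60,           # from the example in the text
        "integration_testing": 240,   # assumption: hours rather than minutes
        "complete_validation": 480,   # assumption
    },
    "local_violation": {
        "code_remediation": 15,       # assumption: an easy local fix
        "unit_testing": 15,           # assumption
        "integration_testing": 0,
        "complete_validation": 0,
    },
}

def estimate_debt(violations):
    """Sum remediation and validation effort (in minutes) across all
    violations found by a quality check -- the 'sum' step."""
    return sum(sum(EFFORT_MODEL[v].values()) for v in violations)

# Example: one complex system-level defect plus two local violations.
findings = ["complex_system_defect", "local_violation", "local_violation"]
total_minutes = estimate_debt(findings)
print(f"Estimated technical debt: {total_minutes} min "
      f"({total_minutes / 60:.1f} h)")
```

In practice the effort model would be calibrated from past performance data rather than hard-coded, so that each quality-check run produces estimates that track the team's real remediation and validation history.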
This is great news for the industry, which until this point viewed technical debt as an interesting notion but far from an actionable metric. As a result, development teams over-budgeted for new features and under-budgeted code remediation and validation projects, hoping it would all work out in the end.
But that's not an attitude data-savvy CIOs can afford with shrinking IT budgets and increased scrutiny from management and regulators.
Equipped with real-world estimates of technical debt -- including the remediation and validation of code defects -- management can plan proactively, allocate resources, and set realistic budgets for future projects.
If your organization is interested in better understanding technical debt estimation, and other disruptive trends that will transform business, government, and society over the next 18 to 24 months, download the Deloitte Tech Trends 2014 report, Inspiring Disruption:
“Technical debt is a way to understand the cost of code quality and the impacts of architectural issues. For IT to help drive business innovation, managing technical debt is a necessity. Legacy systems can constrain growth because they may not scale; because they may not be extensible into new scenarios like mobile or analytics; or because underlying performance and reliability issues may put the business at risk. But it’s not just legacy systems: New systems can incur technical debt even before they launch. Organizations should purposely reverse their debt to better support innovation and growth—and revamp their IT delivery models to minimize new debt creation.” – Deloitte Tech Trends 2014
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible despite limited access to the target's systems, with customized quality metrics and an analysis of the liability implications of open source components -- all three of which are critical for M&A due diligence.