All business-critical applications consist of many intertwined components. In Agile Development, these components are built individually in “scrums,” but eventually have to coexist and work together, possibly across many layers (UI, data, business logic). This underscores a fundamental problem for applications created using Agile techniques: How do you ensure that the end product performs reliably and dependably in the production environment?
Some believe that ensuring software quality requires setting aside the technology and focusing just on the basics – the people and the process. This was the stance taken by Bola Rotibi recently in her piece on Application Development Trends titled, “Want Quality Software? Focus on People and Processes -- Not Technology.” In this article, Rotibi puts the first step to achieving software quality in very simple terms: “quality people equipped with the right processes are resolute criteria for the timely and successful delivery of quality software.”
She later points out that one of the best examples of “people and process” can be found within Agile Development, noting:
Any organization that has achieved success through Agile development practices will know that people attitude, culture, training, education and management buy-in and support are essential criteria for quality delivery and stakeholder satisfaction.
We generally agree with Rotibi’s view that the tenets of people and process inherent in the ideal Agile environment set a basis for success. Where we differ is on the need for technology – namely automated analysis and measurement – applied at the right point in the process.
The Agile Conundrum
The very nature of Agile programming conspires against software quality. Bits and pieces of functionality that will eventually become interdependent are created and tested separately in different scrums. New functionality is often added on top of old, which further muddies the architectural waters, threatens reliability and performance, and increases the cost to modify and maintain the software. Moreover, as the number of lines of code grows, architectural complexity grows exponentially.
At this point, performance bottlenecks and structural quality lapses become very hard to detect, making it difficult to see and measure the structural quality of the application as a whole – how the architecture and the implementation hang together to deliver reliable, dependable, mission-critical performance. Being able to reliably find and fix critical architectural bottlenecks in a rapidly evolving code base is the key to developing high-quality applications using Agile techniques.
Unfortunately, even the most qualified software engineer does not have the requisite insight into the thousands of classes, modules, batch programs, and database objects that need to work flawlessly together in the production environment. Most issues of robustness and performance are not hidden behind one specific artifact of code but exist in the interaction between multiple components created in separate scrums.
You Can’t Hit What You Can’t See
This is where you need to involve technology by introducing a system of automated analysis and measurement, which provides comprehensive visibility into component interconnections and assesses the structural quality of the application software as a whole, rather than each part individually. This matters because even if each scrum develops a perfect portion of the project, the entire project is doomed if those modules do not interconnect properly. Visibility and quantification give software engineers the information they need to ensure high performance and reliability, no matter how rapidly the code base evolves.
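To make the idea concrete, here is a minimal, purely illustrative sketch (not CAST’s actual platform or algorithms) of what automated structural analysis looks like in miniature: model the components produced by separate scrums as a dependency graph and flag cyclic dependencies – a structural-quality problem no single scrum can see from its own code alone.

```python
# Toy structural-analysis sketch (illustrative only): detect a dependency
# cycle among components built by separate scrums. The component names and
# dependencies below are hypothetical.

def find_cycle(dependencies):
    """Return one dependency cycle as a list of components, or None."""
    visiting, visited = set(), set()

    def dfs(node, path):
        visiting.add(node)
        path.append(node)
        for dep in dependencies.get(node, []):
            if dep in visiting:                    # back edge -> cycle found
                return path[path.index(dep):] + [dep]
            if dep not in visited:
                cycle = dfs(dep, path)
                if cycle:
                    return cycle
        visiting.discard(node)
        visited.add(node)
        path.pop()
        return None

    for component in dependencies:
        if component not in visited:
            cycle = dfs(component, [])
            if cycle:
                return cycle
    return None

# Hypothetical components from three different scrums:
deps = {
    "ui": ["business_logic"],
    "business_logic": ["data_access", "ui"],   # ui <-> business_logic cycle
    "data_access": [],
}
print(find_cycle(deps))   # -> ['ui', 'business_logic', 'ui']
```

Each scrum’s code is locally fine here; only a whole-application view of the graph reveals the cycle – which is exactly the kind of cross-component visibility the text argues for.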
CAST’s Application Intelligence Platform does what the human eye cannot do. It can read, analyze and semantically understand most kinds of source code, including scripting and interface languages, 3GLs, 4GLs, Web and mainframe technologies, across all layers of an application (UI, logic and data). By analyzing all tiers of a complex application, CAST measures quality and adherence to architectural and coding standards, while providing real-time system blueprints.
So yes, producing high quality software via Agile IS about people and process, but it is ALSO about technology!
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik described the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He went on to detail how CAST Highlight makes these assessments possible with limited access to the target’s systems, customized quality metrics, and insight into the liability implications of open source components – all three critical for M&A due diligence.