Because modern software development is so complex and modular, quality assurance and testing for software risk has become costly, time-consuming, and at times inefficient. That’s why many organizations are turning toward a risk-based testing model that can identify problem areas in the code before it moves from development to testing. But be careful: hidden risks can remain if you don’t implement the model properly throughout your organization.
What is a risk-based testing model? It’s a method of prioritizing functional tests based on the likelihood of failure, the importance of the functionality, and the weighted impact to the business if a failure occurs. A hidden risk of this approach, however, is that it ignores two crucial aspects of application development: the structural quality of the system, and which complex components were changed during development.
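To make the prioritization idea concrete, here is a minimal sketch of weighted risk scoring. The factor weights, the 1–5 scoring scale, and the feature names are all illustrative assumptions, not part of any specific tool or methodology:

```python
# A minimal sketch of risk-based test prioritization.
# Weights, scale, and feature names are hypothetical.

def risk_score(likelihood, importance, business_impact,
               w_likelihood=0.4, w_importance=0.3, w_impact=0.3):
    """Combine three 1-5 ratings into a single weighted risk score."""
    return (likelihood * w_likelihood
            + importance * w_importance
            + business_impact * w_impact)

features = [
    # (name, likelihood of failure, functional importance, business impact)
    ("checkout",       4, 5, 5),
    ("search filters", 3, 3, 2),
    ("profile page",   2, 2, 1),
]

# Test the riskiest functionality first.
prioritized = sorted(features, key=lambda f: risk_score(*f[1:]), reverse=True)
for name, *factors in prioritized:
    print(f"{name}: {risk_score(*factors):.1f}")
```

Note that nothing in this score reflects structural quality or which components changed, which is exactly the blind spot described above.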
I recently gave a presentation at the TesTrek Toronto 2013 conference about some of the problem areas in risk-based testing and how to implement it in an organization. The reception was so positive that the South Western Ontario Software Quality Group asked me to deliver a special webinar of my presentation, which you can view on-demand here. Be sure to download the presentation and slides if you want to find the holes in your testing strategy.
Some of what we’ll cover includes:
- Why managers need to know and understand the tools their development teams are using.
- How different parts of an application, and its integration points, introduce hidden risk into an organization.
- Why a piece of code that is only a small part of the overall application can, through its integration with the rest of the system, carry far more risk than a single bug fix would suggest.
Let’s face it: software development organizations have become too siloed. Management rarely has any insight into how the application actually gets made, what tools its developers are using, or whether they’re using any at all. And it’s now commonplace for organizations to use developers from all over the globe, each building individual components with no collaboration between teams. So when the “final” application gets sent off to testing, the testers have no idea what they’re getting. And because the application is so complex, addressing every bug quickly becomes an insurmountable task.
Because of this, when an organization begins implementing a risk-based testing strategy, there’s usually a make-or-break moment when it realizes the strategy is either going to work or fail miserably. Recently, I was able to witness one of those ‘aha!’ moments first-hand.
I was doing a software risk assessment on an application that included a Java component and a COBOL mainframe component, and these components talked to each other a lot. The amusing thing was that the assessment was the first time the two programmers had ever met, even though their components were integral to each other. It was also the first time they’d actually seen how their components were interacting.
So when we presented them with the list of problems in their components, we expected the usual back-and-forth screaming match about whose code was at fault. But it never happened. Instead, we had a holistic conversation about how the components interacted and how the two developers could optimize their code to work together smoothly.
This is what makes risk-based functional analysis tools -- like our Application Intelligence Platform -- so powerful. They give organizations the ability to understand how each piece of the application fits into the development process as a whole. Now, rather than simply throwing the application over the wall, developers can create an outline for testers showing exactly where the most risk lives in the codebase. It’s like giving your QA team its own drone program.
Don’t get stuck trying to use 20th century technology to fix a 21st century problem. Download the webinar, “Your Risk-Based Testing Is Missing The Real Risks”, and learn how to keep your organization’s application portfolio free of hidden risk.
Erik Oltmans, an Associate Partner at EY Netherlands, spoke at the Software Intelligence Forum about how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible with limited access to the target’s systems, customized quality metrics, and insight into the liability implications of open source components - all three of which are critical for M&A due diligence.