I recently found myself in yet another endless discussion about how bug fixes and extra capacity impact the results of a Software Analysis and Measurement (SAM) assessment.
My interlocutor's first reaction was that it must be the computing configuration (i.e., the way quality findings are turned into an assessment score, status, etc.) that had changed. Fixing bugs or adding extra capabilities wouldn't have that impact on assessment results. Therefore, keeping the computing configuration stable would keep the results stable.
Then, after I explained that finding new or more accurate dependencies would impact the SAM assessment results -- thanks to a better understanding of complex behaviors, for instance -- my interlocutor reluctantly accepted that it could have a tiny impact, but by no means a dramatic one. His main argument was this: In real life, one would not lose a certification because of additional knowledge. And this is where I tend to disagree most when dealing with risk.
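To make the point concrete, here is a minimal sketch of why a perfectly stable computing configuration does not guarantee stable results. All names, weights, and finding types below are illustrative assumptions, not any real SAM tool's scoring model: the scoring function never changes, yet a newly discovered finding shifts the outcome dramatically.

```python
# Hypothetical scoring sketch. The "computing configuration" (the function
# and its weights) stays identical across both runs; only the analyzer's
# knowledge of the code changes.

def assessment_score(findings, weights):
    """Score = 100 minus a weighted penalty per finding, floored at 0."""
    penalty = sum(weights.get(f["type"], 1) * f["count"] for f in findings)
    return max(0, 100 - penalty)

# Stable configuration: these weights never change.
weights = {"style": 1, "complexity": 3, "cwe": 25}

before = [{"type": "style", "count": 10}, {"type": "complexity", "count": 5}]
# A more accurate dependency analysis now reveals two proven CWE weaknesses:
after = before + [{"type": "cwe", "count": 2}]

print(assessment_score(before, weights))  # 75
print(assessment_score(after, weights))   # 25 -- same configuration, dramatic change
```

The configuration is held constant; what changed is the input knowledge, which is precisely the situation my interlocutor refused to accept.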
For example, when assessing the safety hazard of a plant:
- Would the knowledge that a given construction material is a carcinogen not change the assessment result?
- Couldn’t this cause a small or dramatic effect, depending on the amount of hazardous material found in the audited plant?
- And wouldn’t the results change in an unpredictable way as, up to this point, no one cared about measuring the amount of the hazardous material?
At this point, my interlocutor started to become evasive because he still could not accept such changes in the SAM world.
- What if I know that you have a proven CWE vulnerability in your code?
- Should I keep silent, as you would not accept a dramatic impact on your assessment?
- Should I minimize the risk, as you would only accept a tiny impact on the assessment outcome?
That is basically what 99 percent of people ask for (I should say 100 percent, but I would rather leave room for the few who remain rational in the digital world of IT).
Is a dramatic change disturbing? Yes, of course. But isn't it also disturbing in the real world? Knowing what it will cost you to remove asbestos from the 56 floors of the Montparnasse Tower must be disturbing. I read it could cost up to 800,000 EUR per floor.
But that doesn’t change the fact that asbestos is now known to be a health hazard. I understand that some people -- most likely the ones signing the checks -- would be willing to say that the tower is as safe a place to work in as it was before the world knew asbestos was a health hazard and before the asbestos level was measured in the tower. But that is not a reason to hide the truth.
So the question now becomes: How do we handle the change?
To answer this question, we can look to the non-IT world (let me call it "the real world" from now on).
I also happen to have worked on a roller-bearing assembly line. Whenever a cutting tooth on a CNC cutter needed to be changed, not a single person in the plant would assume you could fire the cutter back up right away. Not before a proper re-calibration of the cutter had been done.
As for just-in-time strategies, productivity measurement in industrial processes, and basic housekeeping principles, it seems the IT world considers itself so different -- or even superior -- that the real world's principles would not even apply.
- How many people, even in the workplace, let their computers fill up with so many garbage files and programs that they end up buying a brand-new computer? As if they would hoard junk in their home or office, then move to another home or office when the first one is full. (I know it does happen, but it usually ends up on reality TV shows.)
- How many IT professionals think that productivity measurement is only about the volume of code produced, and completely disregard the quality of the production? The industrial world knows for a fact that volume without quality is not the path to a competitive edge.
- How much effort does it take to convince IT professionals that just-in-time strategies and event-driven architecture can yield the same responsiveness to business requirements and the same resource-usage efficiency in delivering IT outcomes as they did on assembly lines? In their defense, it took many decades to convince the industrial world of these benefits. The pity is that IT doesn’t have the excuse of being the pioneer in this domain; it has a huge amount of knowledge and experience to leverage. Yet it seldom does.
SAM is not the only measuring activity in the world. However, its practitioners need to reach the same level of maturity as their real-world counterparts.
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik described the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now equally value technical assessments, especially for targets with significant software assets. He went on to detail how CAST Highlight makes these assessments possible with limited access to the target's systems, customized quality metrics, and insight into the liability implications of open source components -- all three of which are critical for M&A due diligence.