“S” stands for security, something “S” organizations like Sony and Sega appeared to have too little of earlier this year. You could also say “S” represents the U.S. Dollar sign ($) that is associated with the FDIC and IRS, both of which have recently fallen victim to phishing attacks and have had their security compromised. Unfortunately, they are not alone; organizations that start with many letters of the alphabet have fallen victim to security issues this year.
The reason these organizations have been victimized leads us to another “S” term – sensitive information. Particularly in the case of the breach at a Department of Defense contractor in March, which was only revealed last month, the greater the sensitivity of the data an organization stores, the bigger a target it becomes. Sensitive data has placed the United States in the crosshairs of many hackers – both independent ones and those sponsored by foreign governments. The United States has become such a significant target that Secretary of Defense Leon Panetta recently noted, “more than 60,000 new malicious software programs or variations are identified every day threatening our security, our economy and our citizens.”
This leads us to our third “S” word of the day – software. First off, there’s the malicious software of which Panetta speaks – viruses, malware, Trojans – much of which gets caught by security software…if it is up to date. Surveys have shown that end users all too often fail to update their antivirus software because 83% of them believe their PCs are already “clean.” As a result, we find that anywhere from 1 to 50 percent of all computers (depending upon which study you read) are infected by some form of malicious software.
But there is also the application software that already makes up a company’s IT system. While organizations claim extreme diligence when it comes to stopping malicious software, the impact of these external attacks could be dampened significantly, possibly even rendered moot, if more attention were paid to existing application software and another “S” term – structural quality.
Many companies find themselves with systems riddled with software that rates low in structural quality. This is due to a number of factors, the most common of which are poorly written code and antiquated software harboring latent vulnerabilities – either of which can be exploited by hackers who seem to be growing smarter at a rate exponentially greater than those developing the security systems meant to stop them.
This portion of the blog comes courtesy of the recently released HP/Capgemini World Quality Report and is brought to you by the numbers “85” and “42.”
The first number stands for 85%. According to the World Quality Report, 85% of businesses now recognize application software quality as a priority and a focal point for IT spending as the economy begins to rebound. In one recent article about the World Quality Report, Murat Aksu, global head of the HP Software Alliance at Capgemini, said, “Business leaders see application quality as the strategic cornerstone of their competitive economy.”
There is significance to the fact that software structural quality has finally broken through as a top priority. It demonstrates that companies have taken to heart the issues surrounding software failure and have begun to realize it is better business to build software correctly the first time than to try to fix failures, outages and security breaches after the fact (the basis for the concept of technical debt). This revelation seems almost a reaction to the predictions of Andy Kyte at Gartner, who last year forecast that technical debt would top $1 trillion worldwide by 2015.
But while 85% see application quality as a competitive differentiator, only about half of them – our other sponsor for this section, 42% – plan to do something about it. Of this, Aksu says:
“We see that in the Western Hemisphere and EMEA [Europe, the Middle East and Africa], excluding Eastern Europe, economies are not doing that well. As a result, the IT investment is smaller, as is the investment in cloud computing and security testing, compared to the rest of the world.”
So even though nearly every business in the world understands that there is a global problem with the structural quality of application software, more than half either cannot or will not do anything about it. That sounds like an awfully contrarian way for companies to keep their data from being trashed.
Only through increased diligence and strict attention to application software quality will businesses be able to sweep the hackers away and bring sunny days to “The Street.” Failing to do so will just leave them "grouchy."
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible with limited access to the target’s systems, customized quality metrics, and insight into the liability implications of open source components – all three of which are critical for M&A due diligence.