The two Sony PlayStation security breaches that affected more than 100 million account-holders over the past couple of weeks (77 million in the first, with another 26 million last week) and exposed their personal information to hackers are just the latest example of how software code vulnerabilities can lead to the failure of mission-critical applications.
But what is being done about it? The New York Times’ Nick Bilton suggests that people are just expecting the Feds to step in and regulate things and that even Congress thinks this is where things are heading. He quotes Connecticut Sen. Richard Blumenthal as saying, “There needs to be new legislation and new laws need to be adopted [to protect the public]. Companies need to be held accountable and need to pay significantly when private and confidential information is imperiled.”
To Bilton’s credit, his reaction to this statement is, “But how?” He also goes on to note, “Technology also has a way of advancing far ahead of the law.” So far ahead, in fact, that he relates a story told by privacy and copyright attorney Christina Gagnier of a case heard before the U.S. Supreme Court last year in which Chief Justice John G. Roberts, Jr., the highest judicial officer in the country, “asked how text messaging works. If two messages are sent simultaneously, he asked, does one get a ‘busy signal’?”
So obviously the Feds aren’t up to the challenge…but who is?
Whether it’s the Sony Playstation hack or the system outages at financial organizations over the last few months, including the ones at the London Stock Exchange and NASDAQ, it seems many of these breaches begin with some point of vulnerability within the software code. Some of these vulnerabilities exist in newly created code while others extend from existing code on top of which newer applications are built.
While companies should be doing more during the build process to locate areas of potential risk, most do little or nothing.
Studies have shown that 0.025% (that's one-fourth of one-tenth of a percent) of the lines of code in an average enterprise application contain vulnerabilities. A subset that minuscule doesn't seem worth worrying about; trying to find 0.025% is worse odds than trying to find a needle in a haystack. But when you consider that the average business application contains over 400,000 lines of code, that still leaves roughly ONE HUNDRED points of infiltration for potential hackers!
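The arithmetic behind that figure is simple enough to check in a few lines, using only the two numbers the article cites:

```python
# Back-of-the-envelope check of the article's figures:
# a 0.025% vulnerability rate across ~400,000 lines of code.
vulnerability_rate = 0.025 / 100   # 0.025% expressed as a fraction
lines_of_code = 400_000

vulnerable_lines = lines_of_code * vulnerability_rate
print(vulnerable_lines)  # 100.0 -- roughly one hundred vulnerable lines
```

A tiny percentage of a very large codebase is still an absolute number big enough to matter.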
Still, there’s no way a company could find the right 100 lines of code by hand, and you can’t fix what you can’t find.
In his 1947 essay, “The Catastrophe of Success,” Tennessee Williams writes, “Security is a kind of death.” This is true for application software.
As applications become more sophisticated, there is no way you can stop hacks with traditional security software. If points of vulnerability within the structure are not addressed during the build process, even the best security system will only tell you when someone or something has breached your structure; it won’t keep them out. The problem must be solved by examining the code before the application is deployed.
Automated software analysis provides the means to see the whole application and go beyond one developer’s view of things like input validation, which provides an easy entry for a hacker, or any business transaction that might fail on its own. Automated measurement of that analysis provides management the means to track, incentivize and ensure that security, stability and efficiency traps are not introduced either inadvertently or maliciously into the enterprise software. In this way, if you can see the potential threat, you can eliminate it before it becomes a future security problem.
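To make the input-validation point concrete, here is a minimal, hypothetical sketch of the kind of flaw an automated analyzer flags and a single developer can easily miss. The function names and table are illustrative, not drawn from any of the systems mentioned above:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # FLAW: user input is concatenated directly into the SQL statement.
    # Input such as "x' OR '1'='1" turns the WHERE clause into a tautology
    # and returns every row -- a classic injection entry point.
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # FIX: a parameterized query keeps the input as data, not executable SQL.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()
```

Structural analysis tools catch patterns like the string concatenation above across an entire codebase, including the code no single developer ever reviews end to end.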
The ultimate baseline for security, therefore, should be assessing the structural quality of the application software before it is deployed to find and then fix potential breach points. If companies do not take it upon themselves to do this, their application software will continue to be a playground for hackers.
Erik Oltmans, an Associate Partner at EY Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik described the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He went on to detail how CAST Highlight makes these assessments possible with limited access to the target's systems, provides customized quality metrics, and surfaces the liability implications of open source components: all three critical for M&A due diligence.