But what the Bard of Avon once penned as Henry V’s words exhorting his troops to fight has, in the age of business technology, devolved into a lament sung by companies that store personal data as they suffer the humiliation of data breaches. Few things impact today’s businesses so thoroughly and in so many ways – lost business, lost income, lost customers and damaged reputation to name a few of the consequences.
The Ponemon Institute recently released its annual study, sponsored by Symantec, on the financial impact of data breaches on U.S. companies, and the numbers for 2010 are staggering. The average cost of a data breach rose more than 14% to $7 million in 2010, with the costliest breach robbing $35.3 million from its corporate victim. The per-record cost of a breach, taken at face value, does not seem so imposing, having risen to $268 from $174 the previous year; but because breaches normally affect tens of thousands of records, this number quickly reaches astonishing totals.
While the Ponemon Institute cited “negligence” as the leading reason for a data breach, with 41% of breaches being blamed on it, an astonishing 31% of breaches were the result of “Cybercrime.” In other words, in nearly a third of all data breaches, there was a vulnerability somewhere in the software that some “enterprising” individual was able to exploit to gain access to sensitive information.
The Gates of Mercy Shall be all Shut Up
In addition to the numbers it used to illustrate the severity of the data breach problem for U.S. businesses, the Ponemon study also noted that a company's reaction plays a significant role in either lessening or exacerbating the damage caused by a breach. Surprisingly, the study showed that companies that took their time, investigated the breach, and notified only those affected were more apt to come out of the attack minimally impacted. The report said that those that reacted quickly often failed to investigate the breach completely and over-notified those potentially affected, resulting in customer panic and lost business; lost business alone accounted for 63% of the financial impact of the average data breach in 2010.
The Ponemon Institute does offer some thoughts on preventive solutions: encryption (including whole-disk encryption and encryption for mobile devices and smartphones), data loss prevention (DLP) solutions, identity and access management solutions, and endpoint security and other anti-malware tools. But all of these amount to closing the gates after the horses have left. Nowhere does the report discuss preemptive measures, such as automated analysis and measurement of application software to detect potential vulnerabilities and help eliminate them before a breach ever has the opportunity to occur.
In Your Fair Minds Let This Acceptance Take
When human beings “are breached” – that is, when we are sick – it is almost always the preferred and accepted practice to treat the cause and not just the symptoms. Would it not stand to reason, therefore, that the accepted practice upon the occurrence of a data breach would be to address the cause – the flaw in the structural quality of the application software? Better yet, why not make it accepted practice to address the structural quality of the application software before deployment, so that it is never left open to a potential breach?
Software quality is critical because most attacks and system faults occur at the application layer. Structural flaws and vulnerabilities can expose application software to attack. By performing automated analysis and measurement of the application software using a platform such as CAST’s Application Intelligence Platform, companies can identify precise areas within the software code that leave Web-based applications vulnerable to hackers. Given the ability to see the potential issues, the business can then protect its business-critical systems and not fall “Once more into the breach.”
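To make the idea of automated analysis concrete, here is a minimal, illustrative sketch of the kind of check such tooling performs: walking an application's syntax tree and flagging SQL statements built from dynamic string construction, a classic injection vulnerability. This toy example uses Python's standard `ast` module; the function names, heuristics, and sample source are hypothetical and vastly simpler than what a commercial platform like CAST's actually does.

```python
import ast

# Hypothetical sample application code containing one vulnerable
# query (string concatenation) and one parameterized, safe query.
SOURCE = '''
def find_user(cursor, name):
    cursor.execute("SELECT * FROM users WHERE name = '" + name + "'")

def find_user_safe(cursor, name):
    cursor.execute("SELECT * FROM users WHERE name = %s", (name,))
'''

def is_dynamic(node):
    """Heuristic: is this expression built at runtime from other values?"""
    # "..." + name or "..." % name
    if isinstance(node, ast.BinOp):
        return True
    # f"... {name} ..."
    if isinstance(node, ast.JoinedStr):
        return True
    return False

def scan(source):
    """Return line numbers of execute() calls whose query is dynamic."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == "execute"
                and node.args
                and is_dynamic(node.args[0])):
            findings.append(node.lineno)
    return findings

# Reports the line of the concatenated query, not the parameterized one.
print(scan(SOURCE))
```

A real structural-quality analysis tracks data flow across methods, layers, and languages rather than matching single expressions, but the principle is the same: find the flaw in the code before an attacker does.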
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible: working with limited access to the target’s systems, providing customized quality metrics, and surfacing the liability implications of open source components – all three of which are critical for M&A due diligence.