Once More into the Breach

by Jonathan Bloom

“Once more unto the breach, dear friends, once more…” wrote William Shakespeare in his play Henry V.

But what the Bard of Avon penned as Henry V’s exhortation to his troops has, in the age of business technology, devolved into a lament sung by companies that store personal data as they suffer the humiliation of data breaches. Few things hit today’s businesses so hard and in so many ways – lost business, lost income, lost customers and a damaged reputation, to name a few of the consequences.

The Ponemon Institute recently released its annual Symantec-sponsored study on the financial impact of data breaches on U.S. companies, and the numbers for 2010 are staggering. The average cost of a data breach rose more than 14% to $7 million in 2010, with the costliest breach robbing $35.3 million from its corporate victim. The per-record cost, taken at face value, may not seem so imposing – it rose to $268 from $174 the previous year – but breaches normally affect tens of thousands of records, so the totals mount quickly: at $268 per record, a breach of only about 26,000 records reaches that $7 million average.

While the Ponemon Institute cited “negligence” as the leading cause of data breaches, blamed for 41% of them, a striking 31% were the result of “cybercrime.” In other words, in nearly a third of all data breaches there was a vulnerability somewhere in the software that some “enterprising” individual was able to exploit to gain access to sensitive information.

The Gates of Mercy Shall Be All Shut Up

In addition to the numbers it used to illustrate the severity of the data breach problem for U.S. businesses, the Ponemon study noted that a company’s reaction plays a significant role in either lessening or exacerbating the damage a breach causes. Surprisingly, companies that took their time, investigated the breach thoroughly, and notified only those actually affected were more likely to emerge minimally impacted. Those that reacted quickly, the report said, often failed to investigate the breach completely and over-notified those potentially affected, resulting in customer panic and lost business – the latter accounting for 63% of the financial impact of the average data breach in 2010.

The Ponemon Institute does offer some thoughts on preventative solutions: encryption, including whole-disk encryption and encryption for mobile devices and smartphones; data loss prevention (DLP) solutions; identity and access management solutions; and endpoint security and other anti-malware tools. But all of these amount to closing the gates after the horses have left. Nowhere does the report discuss preemptive measures such as automated analysis and measurement of application software to detect potential vulnerabilities and help eliminate them – precluding the opportunity for a breach in the first place.

In Your Fair Minds Let This Acceptance Take

When human beings “are breached” – that is, when we are sick – the preferred and accepted practice is almost always to treat the cause, not just the symptoms. Would it not stand to reason, then, that the accepted practice upon the occurrence of a data breach would be to address the cause – the flaw in the structural quality of the application software? Better yet, why not make it accepted practice to address the structural quality of application software before deployment, so it is never left open to a potential breach?

Software quality is critical because most attacks and system faults occur at the application layer, where structural flaws and vulnerabilities leave application software exposed to attack. By performing automated analysis and measurement of application software using a platform such as CAST’s Application Intelligence Platform, companies can identify the precise areas within the code that leave Web-based applications vulnerable to hackers. Able to see the potential issues, the business can then protect its business-critical systems and not fall “Once more into the breach.”
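To make the idea concrete, consider the kind of structural flaw such analysis typically flags. The Java sketch below is purely illustrative – a common vulnerability class chosen by way of example, not a depiction of CAST’s actual rules or output. It shows a classic SQL injection weakness, where user input is concatenated into a query, alongside the parameterized alternative an analyzer would steer developers toward:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class CustomerLookup {

        // Vulnerable: user input is concatenated directly into the SQL text.
        // Input such as  ' OR '1'='1  turns the WHERE clause into a tautology
        // and returns every row -- the kind of flaw static analysis flags.
        public ResultSet findCustomerUnsafe(Connection conn, String name)
                throws SQLException {
            Statement stmt = conn.createStatement();
            return stmt.executeQuery(
                    "SELECT * FROM customers WHERE name = '" + name + "'");
        }

        // Safer: a parameterized query keeps user input out of the SQL text,
        // so the database treats it strictly as data, never as executable SQL.
        public ResultSet findCustomerSafe(Connection conn, String name)
                throws SQLException {
            PreparedStatement stmt = conn.prepareStatement(
                    "SELECT * FROM customers WHERE name = ?");
            stmt.setString(1, name);
            return stmt.executeQuery();
        }
    }

Finding and fixing flaws like the first method before deployment is precisely the sort of preemptive measure the Ponemon report leaves undiscussed.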

Jonathan Bloom, Technology Writer & Consultant
Jonathan Bloom has been a technology writer and consultant for over 20 years. During his career, Jon has written thousands of journal and magazine articles, blogs and other materials addressing various topics within the IT sector, including software development, enterprise software, mobile, database, security, BI, SaaS/cloud, Health Care IT and Sustainable Technology.