Happy New Year!!!...or is it?
It doesn’t matter how many days removed we are from sipping champagne, singing "Auld Lang Syne," making New Year’s resolutions, watching the ball drop in New York’s Times Square, and hitting the virtual "Reset button" to start a new year. We still need to look back at 2017, lest we be condemned to repeat the mistakes we made.
This is especially true if you are responsible for data security at the UK's National Health Service, British Airways, the FCC, Deloitte, Equifax, the U.S. Securities and Exchange Commission, or any other organization that experienced a data breach due to application security failures in 2017.
The baffling part about these breaches at high-profile organizations is that they keep happening. Last year's breaches all seemed to result from the same or similar issues as the ones I wrote about in 2011. Apparently, these companies have allowed old acquaintances to be forgot, and have failed to bring to mind past issues...at least not in time to avoid falling victim to cyberattacks.
Crying Over Spilled Data
Just as Europe was embarking on its annual May banking holiday, the world was introduced to the WannaCry ransomware attack, which brought the UK's National Health Service, British Airways, and many other critical services to a halt. The attack once again exposed how dependent today’s businesses are on IT systems, and shined a very bright spotlight on the number of vulnerabilities that exist in the critical, highly complex software used by vital infrastructure sectors including government agencies, airlines, and telecom operators. Some of the problem is attributable to a lack of control over outsourcing.
"With a majority of the IT systems in their second and third generations of outsourcing contracts, there is very little visibility they have in the underlying risk and security vulnerabilities in their IT estates," said CAST UK SVP Vishal Bhatnagar in his report on WannaCry.
But Bhatnagar adds that it's just as much a human issue as a control issue. "At the engineer’s level security is an afterthought, developers often think of themselves as ‘artists’, more than programmers that have to follow coding standards and best practices," said Bhatnagar. "Spending more IT budget on risk prevention means that there is less to spend on the delivery of technology innovation and a culture of 'Code now, fix later.' This is a cultural issue, which most managers outside of IT would recognize as one of the toughest to fix."
Can't Deny the Irony
When one of the world's best-known cyber terrorist groups, Anonymous, levels a denial of service attack on the ability of people to communicate with the U.S. Federal Communications Commission, you have to smirk at the irony of it.
While a communications agency having communication issues is ironic, the reasons for this border on the tragic and the negligent. It smacks of the kind of oversight highlighted by former U.S. CIO Tony Scott, who said that a soon-to-be-released report covering a plethora of issues regarding the modernization of the government’s existing IT framework fails to address one key issue: modernizing aging legacy systems.
“I think it’s a crisis that’s bigger than Y2K. It’s just creeping up on us slowly, month by month, year by year,” said Scott during an event hosted by CISQ during Cyber Week last year. “But there is a point in the future where there’s just not going to be the knowledgeable resources to keep the old stuff going on the one hand, and then not enough resources to migrate off of those old things on the other hand. It’s something that I think is a problem now and we really need to move aggressively to get it done.”
For one of the largest accounting firms in the U.S., last year's cyber-attack did not adversely affect many of its clients. While not very damaging, the breach was deeply embarrassing, as Deloitte prides itself on helping other organizations thwart cybersecurity breaches.
Regardless of its impact, the attack on the email system at Deloitte, which exposed confidential emails for a few high profile clients, highlighted two big things: first, financial organizations continue to be a huge cybercrime target, and second, their software systems continue to be highly vulnerable and open to attack.
Deloitte was not alone among financial organizations that fell under attack in 2017, and it was far from the worst. Depending upon your point of view, that dubious distinction may have fallen to Equifax.
First, the consumer credit company was breached through an application vulnerability, resulting in the personal data of 143 million people being exposed. And as if that were not bad enough, the page on the Equifax web site where consumers dispute errors on their credit reports was then hijacked later in the year by a malicious third party. In the second breach, attackers spoofed the Equifax site and prompted visitors to update Adobe Flash. Rather than a Flash update, however, the link delivered a file that implanted malware on users’ systems.
Compounding the rough year 2017 had in application security, the U.S. Securities and Exchange Commission (SEC) finally revealed in September 2017 that its Electronic Data Gathering, Analysis and Retrieval (EDGAR) system was breached the previous year. The breach exposed nonpublic, insider information that may have provided cybercriminals with data that could yield a significant financial reward in traded stocks.
An Ounce of Prevention
So where do we go from here?
Historically, companies have satisfied themselves with "security software" that alerts them after a breach has happened so they can minimize damage. That thinking is on par with your household's smoke detectors, which allow you to escape a fire, but do little to prevent damage to your home. They're still a very important element in household safety, but there's a lot more that can be done to avoid the fires that cause those smoke detectors to sound.
Obtaining facts about the holistic performance and risk levels across mission-critical software systems is a fundamental starting point to preventing cybersecurity breaches. With an average of 5,000 new vulnerabilities emerging every year, automating system-level analysis for software security is key. Only after a clear understanding of software risk is available can smart decisions be made to protect the organization and its customers moving forward.
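To make the idea of automated, system-level risk analysis concrete, here is a minimal sketch of one small piece of it: checking a dependency manifest against a list of components with known vulnerabilities. The component names, versions, and advisory list below are hypothetical illustrations; a real pipeline would pull advisories from a feed such as the NVD and run on every build.

```python
# Minimal sketch of automated dependency scanning, one element of
# system-level software risk analysis. The advisory list below is
# hypothetical; a production pipeline would consume a real feed
# (e.g., the NVD) rather than a hard-coded set.

# (component, version) pairs flagged by a security advisory feed
KNOWN_VULNERABLE = {
    ("struts", "2.3.31"),
    ("openssl", "1.0.1f"),
}

def scan_manifest(manifest):
    """Return the sorted subset of (component, version) pairs with known issues."""
    return sorted(set(manifest) & KNOWN_VULNERABLE)

if __name__ == "__main__":
    # Hypothetical dependency manifest for one application
    deps = [("struts", "2.3.31"), ("log4j", "1.2.17"), ("openssl", "1.1.0")]
    for name, version in scan_manifest(deps):
        print(f"ALERT: {name} {version} has a known vulnerability")
```

The point is not the twenty lines of code but the practice: when checks like this run automatically on every system, vulnerable components surface before attackers find them, rather than after.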
Hopefully we will see organizations take heed of lessons learned in 2017 so I have something else to write next year other than, “I told you so!”
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible with limited access to the target's systems, customized quality metrics, and analysis of the liability implications of open source components, all three critical for M&A due diligence.