When you are a consumer credit company, victimized recently by a serious security breach where hackers exploited an application vulnerability to steal the personal information of roughly 143 million people, what do you do for an encore? For Equifax, the encore may be “get hacked a second time.”
As reported by a security analyst last week, consumers trying to dispute errors on their credit reports were prompted by a page on the Equifax web site to update Adobe Flash. The link provided, however, pointed not to a Flash update but to a file that implanted malware on users’ systems. Upon learning of the issue, Equifax took down the page for “maintenance,” and the company is currently looking into what happened. With Equifax already the target of a class action suit over the September breach, the new incident will likely multiply the damage to the company’s already shaky reputation, not to mention its wallet.
Like so many corporate security failures before it, the Equifax breaches (or breach, depending upon the outcome of last week’s debacle) call attention to the damage that poor software structural quality – which plays a leading role in application vulnerability – can do to a company’s operations. The pattern leaves one asking how a company can avoid falling victim to hackers who exploit the defects poor structural quality creates, and the business interruption that follows.
Structural quality is essential for managing the root drivers of IT costs and business risks in mission-critical applications. Unlike the quality of the process by which software is built, enhanced and maintained, functional, non-functional and structural quality have to do with the software product itself – the asset that generates business value.
Accurately analyzing and measuring the quality of an application (which typically has a large number of components interconnected in complicated ways) and its connections with databases, middleware, and APIs is monstrously complex. In most cases, developers wait until they deploy applications to perform a security assessment or repair defects, but patching after deployment is costly and inefficient in the long run. Moreover, post-deployment testing places a company in the same uninformed position as the captain of the Titanic: you can see the top of the iceberg, but it is what lies beneath the surface that causes the damage.
The best time to address software quality is early in the development process. Until recently, this was done through manual risk assessments or manual static analysis, but with the average application carrying more than one million lines of code, that approach wastes enormous amounts of resources, time, and money.
By contrast, automated solutions capable of delivering software quality benchmarks give organizations a way to measure application size and complexity accurately and efficiently, based on source code characteristics. Developers can use them to track improvement efforts as they develop or maintain an application within a complex infrastructure. Incorporating automated analysis at the source code, component, and application system levels yields benchmarks for improving quality, productivity, vendor value, and – because it finds the flaws that lead to application vulnerabilities – security.
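To make the idea of automated source-level analysis concrete, here is a minimal sketch in Python of the kind of check such tools run at scale: walking a program’s syntax tree and flagging calls that commonly lead to vulnerabilities. The rule set, function names, and sample code below are illustrative assumptions for this sketch, not the checks of any particular commercial product.

```python
# Minimal sketch of source-level static analysis (illustrative only):
# parse Python source into an AST and flag call sites that are common
# sources of injection-style vulnerabilities. The RISKY_CALLS rule set
# is a hypothetical example, far smaller than a real analyzer's rules.
import ast

RISKY_CALLS = {"eval", "exec", "os.system", "pickle.loads"}

def call_name(node: ast.Call) -> str:
    """Return a dotted name for a call target, e.g. 'os.system', or ''."""
    func = node.func
    if isinstance(func, ast.Name):
        return func.id
    if isinstance(func, ast.Attribute) and isinstance(func.value, ast.Name):
        return f"{func.value.id}.{func.attr}"
    return ""

def find_risky_calls(source: str) -> list[tuple[int, str]]:
    """Return (line number, call name) pairs for each flagged call."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            name = call_name(node)
            if name in RISKY_CALLS:
                findings.append((node.lineno, name))
    return sorted(findings)

# Hypothetical vulnerable snippet: untrusted input reaches a shell and eval.
sample = "import os\nuser_input = input()\nos.system(user_input)\neval(user_input)\n"
print(find_risky_calls(sample))  # → [(3, 'os.system'), (4, 'eval')]
```

Real structural-quality tools go far beyond this – tracking data flow across components, databases, and APIs – but the principle is the same: the source code itself carries measurable signals of risk, and a machine can scan millions of lines of it long before deployment.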
While it remains to be seen whether the second Equifax mishap shares a source with the first, companies should not wait to be breached before examining the weaknesses in their applications. Using automated solutions to improve software structural quality and close the curtain on application vulnerabilities can save them from a disastrous encore.