An Encore for Equifax?


When you are a consumer credit company, recently victimized by a serious security breach in which hackers exploited an application vulnerability to steal the personal information of roughly 143 million people, what do you do for an encore? For Equifax, the encore may be "get hacked a second time."

As a security analyst reported last week, consumers trying to dispute errors on their credit reports were prompted by a page on the Equifax web site to update Adobe Flash. Rather than a Flash update, however, the link delivered a file that implanted malware on users' systems. Upon learning of the issue, Equifax took the page down for "maintenance," and the company is currently investigating what happened. Nevertheless, as the company is already the target of a class action suit over the September breach, this latest incident will likely multiply the damage to Equifax's already shaky reputation, not to mention its wallet.

Like so many corporate security failures before them, the Equifax breaches (or breach, depending upon the outcome of last week's debacle) call attention to the damage that poor software structural quality, which plays a leading role in application vulnerability, can do to a company's operations. The pattern leaves one asking how a company can avoid falling victim to hackers who exploit the weaknesses that poor structural quality creates, and the business interruption that follows.

Structural quality is essential for managing the root drivers of IT costs and business risks in mission-critical applications. Unlike the quality of the process by which software is built, enhanced and maintained, functional, non-functional and structural quality have to do with the software product itself – the asset that generates business value.

Accurately analyzing and measuring the quality of an application (typically a large number of components interconnected in complicated ways), along with its connections to databases, middleware, and APIs, is monstrously complex. In most cases, developers wait until they deploy applications to perform a security assessment or repair defects, but patching after the fact is costly and inefficient in the long term. Moreover, post-deployment testing places a company in the same uninformed position as the captain of the Titanic: you can see the top of the iceberg, but it's what lies beneath the surface that causes the damage.

The best time to address software quality is early on in the development process. Until recently, this was done by manually performing risk assessments or static analysis, but with the average application carrying more than one million lines of code, manual review wastes enormous amounts of resources, time, and money.

By contrast, automated solutions capable of delivering software quality benchmarks let organizations accurately and efficiently measure application size and complexity based on source code characteristics. Developers can use them to monitor improvement efforts as they develop or maintain an application within a complex infrastructure. Incorporating automated analysis at the source code, component, and application system levels provides software quality benchmarks for enhancing quality, productivity, vendor value, and – because it finds the flaws that lead to application vulnerabilities – security.
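To make the idea of source-code-level automated analysis concrete, here is a minimal, hypothetical sketch of the technique in Python: walking an application's syntax tree to flag calls commonly associated with injection weaknesses. Real tools apply thousands of such rules across components and technologies; the function and rule names below are illustrative, not any vendor's product.

```python
import ast

# Toy rule set: call names commonly flagged as injection-prone.
# Illustrative only; real analyzers ship far richer rule catalogs.
RISKY_CALLS = {"eval", "exec", "pickle.loads"}

def find_risky_calls(source: str) -> list:
    """Return (line_number, call_name) pairs for risky calls in source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            func = node.func
            # Resolve simple names (eval) and dotted names (pickle.loads).
            if isinstance(func, ast.Name):
                name = func.id
            elif isinstance(func, ast.Attribute) and isinstance(func.value, ast.Name):
                name = f"{func.value.id}.{func.attr}"
            else:
                continue
            if name in RISKY_CALLS:
                findings.append((node.lineno, name))
    return sorted(findings)

sample = "import pickle\ndata = pickle.loads(blob)\nresult = eval(user_input)\n"
print(find_risky_calls(sample))  # → [(2, 'pickle.loads'), (3, 'eval')]
```

Because this inspects the parsed source rather than the running program, it can run on every commit – the "early on in the development process" point made above – instead of waiting for a post-deployment assessment.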

While it remains to be seen whether the second Equifax mishap has a source similar to the first, companies should not wait to be breached before examining the weaknesses in their applications. Using automated solutions to improve software structural quality and close the curtain on application vulnerabilities can save them from a disastrous encore.

Jonathan Bloom Technology Writer & Consultant
Jonathan Bloom has been a technology writer and consultant for over 20 years. During his career, Jon has written thousands of journal and magazine articles, blogs and other materials addressing various topics within the IT sector, including software development, enterprise software, mobile, database, security, BI, SaaS/cloud, Health Care IT and Sustainable Technology.