When Good Software Goes Bad

by Jonathan Bloom

Another week, another software failure.

Last week on the East Coast Main Line, which connects London to Edinburgh, a software malfunction left five trains stranded mid-track and significantly delayed others after a power supply issue knocked out the signaling system. According to reports, software that should have instructed the backup signaling system to kick in failed to function, causing all signals on the line to default to “Red,” halting trains where they stood. The failure left more than 3,000 rail passengers stranded or delayed for more than five hours on a Saturday afternoon.

Software failures like this one have become all too common in recent years. We treat news of them as though failure were inevitable, almost expected. But why? When exactly did we decide that software failure was an unavoidable cost of doing business?

Shouldn’t we do a better job of assessing the structural quality of software before it is deployed rather than waiting for it to fail and then fixing the problem? After all, we know what causes poor software quality:

  • Business Blind Spot: Regardless of the industry, most developers are not experts in their company’s particular domain when they start. It takes time to learn the business, and most of that learning, unfortunately, comes only from correcting mistakes after the software has already malfunctioned.
  • Inexperience with Technology: Mission-critical business applications are a complex array of multiple computer languages and software platforms. Rather than being built on a single platform or in a single language, they tend to be mash-ups of platforms, interfaces, business logic and data management that interact through middleware with enterprise resource systems and legacy applications. In long-standing systems, developers often find themselves programming on top of archaic languages as well. It is rare to find a developer who knows “everything” about this mix, and those who don’t may make assumptions that result in software errors, leading to system outages, data corruption and security breaches.
  • Speed Kills: The pace of business has accelerated dramatically over the past decade. Things move so fast that software is practically obsolete by the time it is installed, and the breakneck speed at which developers are asked to ply their craft often means software quality becomes a sacrificial lamb.
  • Old Code Complexities: A significant majority of software development builds upon existing code. Studies show that developers spend half their time or more trying to figure out what the “old code” does and how it can be modified for the current project. The more complex that code, the more time is spent trying to unravel it. Or, in the interest of time (see “Speed Kills” above), complexity leads to workarounds, which leave a high potential for mistakes.
  • Buyer Beware: Mergers and acquisitions are a fact of life in today’s business climate, and many large applications at acquiring companies are built using code from the companies they acquire. Unfortunately, the acquiring organization cannot control the quality of the software it is taking on, and poor structural quality is not immediately visible to it.

Assessing the Answers

OK, so we know what the problems are, but how do we fix them? First of all, software issues need to be dealt with before they become a problem, not after 3,000 passengers are left stranded on the tracks or a stock exchange is forced to halt trading. To ensure sound structural quality out of the gate, application software should be assessed during the build process using a platform of automated analysis and measurement.

Not only can automated analysis and measurement resolve the code issues that often accompany acquired software, old code, rapid development and developer inexperience, but it also grants significant visibility into the work being done by individual developers. While such scrutiny may seem invasive – like “Big Brother” watching over their shoulders – automated static analysis and measurement can actually be an effective tool for professional development. After all, if you don’t know where someone needs help, you can’t provide it.
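
In practice, such an assessment can run as an ordinary step in the build. The snippet below is a minimal sketch, not a description of any particular product or vendor tool: it assumes a hypothetical command-line analyzer (called “analyze” here) that prints one JSON finding per line, and it fails the build when high-severity findings exceed an agreed budget.

```python
#!/usr/bin/env python3
"""Minimal build-step quality gate (illustrative sketch only).

Assumes a hypothetical analyzer invoked as `analyze --format json <path>`
that prints one JSON object per finding, e.g.
{"rule": "unclosed-resource", "severity": "high", "file": "billing.py", "line": 42}
"""
import json
import subprocess
import sys

MAX_HIGH_SEVERITY = 0  # structural-quality budget for the build to pass


def run_gate(source_dir: str) -> int:
    # Run the (hypothetical) analyzer over the code being built.
    result = subprocess.run(
        ["analyze", "--format", "json", source_dir],
        capture_output=True, text=True, check=False,
    )
    findings = [json.loads(line) for line in result.stdout.splitlines() if line.strip()]
    high = [f for f in findings if f.get("severity") == "high"]

    # Report each high-severity finding so it is attributable and fixable.
    for f in high:
        print(f"{f['file']}:{f['line']}  {f['rule']}")

    if len(high) > MAX_HIGH_SEVERITY:
        print(f"Quality gate failed: {len(high)} high-severity findings.")
        return 1
    print("Quality gate passed.")
    return 0


if __name__ == "__main__":
    sys.exit(run_gate(sys.argv[1] if len(sys.argv) > 1 else "."))
```

Wired into the build pipeline this way, structural weaknesses surface commit by commit, and per-developer trends become visible long before a production outage forces the issue.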

Whether the issue lies with the software or the developer, automated analysis and measurement provides visibility into the problem and a basis for improving software quality. And ultimately, optimal software quality – not just “good enough” software quality – should be every company’s goal.

Jonathan Bloom, Writer, Blogger & PR Consultant
Jonathan is an experienced writer with more than 20 years covering the technology industry. Jon has written over 750 journal and magazine articles, blogs and other materials published throughout the U.S. and Canada. His expertise spans a wide range of subjects within the IT industry, including software development, enterprise software, mobile, database, security, BI, SaaS/cloud, health care IT and sustainable technology. In his free time, Jon enjoys attending sporting events, cooking, studying American history and listening to Bruce Springsteen.