Our society has a particularly annoying habit, one that is not exclusive to any one walk of life, business or industry, nor one we are likely to see given up anytime soon. The habit goes by several names, but is most commonly called “finger pointing” or “the blame game.”
Politicians blame “the other party” for what ails the country. Sports teams blame the officials for losing. Unions blame “big business” for low wages and poor working conditions. Meanwhile, businesses point at their software for breaches of security.
Is the software always to blame, though? No. We’ve seen plenty of instances where security breaches of IT systems have been the result of a human being either divulging access information or inadvertently introducing a virus, worm or Trojan into the system by opening a seemingly innocuous email.
Yet even in many of those cases, where human error plays a part, some vulnerability must still exist within the software for the security breach to take place.
Dropbox fell victim to the “human error exacerbates vulnerability” issue a few weeks ago. During a code update, the online storage service provider introduced a bug that disabled password authentication, exposing data belonging to its estimated 25 million customers for several hours.
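Dropbox has not published the faulty code, so the exact defect is unknown. But a bug of this class can be surprisingly small. The hypothetical Python sketch below (the user store, function names and hashing scheme are all invented for illustration) shows how a careless change to a password check can leave authentication effectively switched off while the code still runs without complaint:

```python
import hashlib

# Hypothetical user store: username -> SHA-256 hash of the password.
# (Illustrative only; a real system should use a salted, slow hash
# such as bcrypt or scrypt, not bare SHA-256.)
USERS = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

def check_password(username, password):
    """Correct version: succeed only when the stored hash matches."""
    stored = USERS.get(username)
    if stored is None:
        return False
    return hashlib.sha256(password.encode()).hexdigest() == stored

def check_password_buggy(username, password):
    """Buggy version after a careless update: the comparison is
    computed but its result is discarded, so any password is
    accepted for a known user."""
    stored = USERS.get(username)
    if stored is None:
        return False
    hashlib.sha256(password.encode()).hexdigest() == stored  # result thrown away
    return True  # bug: authentication is effectively disabled
```

Both versions parse, run and look plausible at a glance, which is exactly why this kind of defect slips past a hurried manual review.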
One day after discovering the bug that led to the Dropbox customer data breach, Dropbox CTO and Founder Arash Ferdowsi owned up to the issue in his blog. He wrote, “This should never have happened. We are scrutinizing our controls and we will be implementing additional safeguards to prevent this from happening again.”
While on the surface Ferdowsi’s comments seem like an admission of blame on behalf of the company, they miss an obvious point – this upgrade never should have seen the light of day with that bug in place. What’s more, the pledge to “scrutinize our controls” suggests that sufficient steps were not taken to ensure the structural quality of the code before deployment. Once again, the software is being blamed rather than the lack of adequate application assessment.
It does not matter if a bug is introduced during an upgrade or if it is a latent vulnerability in existing legacy code; it is still the company’s responsibility to know about it before it affects its customers.
The Dropbox exposure is just more evidence that companies need to improve how they go about ensuring the structural quality of their software – whether they’ve written the software themselves or they are customizing or upgrading existing software.
Ignoring the problem and just hoping that the software won’t fail or expose sensitive information should not be an option. Companies should be held to a much higher degree of responsibility than that.
Unfortunately, even among companies that do accept ultimate responsibility for what happens with their systems, too many continue to try to assess their software manually. But today’s application software is extremely complex: studies have shown the typical application runs to approximately 400,000 lines of code, of which roughly 100, on average, contain errors. Trying to find those 100 errors in 400,000 lines (just 0.025%) by manual means is not only time consuming and expensive, but also horribly ineffective.
Companies that truly want to own all that happens with or because of their software need to focus on an automated approach to application assessment. Only by using some form of automated structural analysis will a company have an optimal chance of finding all the flaws in its software before they become public.
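As a toy illustration of what automated structural analysis means in practice (commercial tools are vastly more sophisticated, and this checker is a sketch invented for this article), the snippet below uses Python’s standard ast module to scan source code for bare except: clauses, a classic structural-quality defect that silently swallows every error:

```python
import ast

def find_bare_excepts(source, filename="<string>"):
    """Return the line numbers of bare 'except:' clauses, which
    catch and hide every exception, including the ones that should
    have stopped the program."""
    tree = ast.parse(source, filename)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

sample = """\
try:
    risky()
except:
    pass
"""
print(find_bare_excepts(sample))  # [3]
```

The point is not this particular rule but the economics: a checker like this inspects 400,000 lines in seconds, every time the code changes, which no manual review process can match.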
Any company that doesn’t perform a thorough structural assessment of its application software is gambling, and it is only a matter of time until that company, too, “drops the ball.”