Living Up to Standards

by Jonathan Bloom

By definition, standards are supposed to be a set of bare minimum requirements for meeting levels of acceptability. In school, the students who took the “standard” level courses were those performing “at grade level” and focused simply on graduating. Every April in the United States we decide whether to take the “standard deduction” – the bare minimum we can claim for our life’s expenses – or whether we have enough to itemize our living expenses and thereby deduct more from our taxable income.

In other words, standards are the vanilla ice cream of business requirements.

When it comes to Technology, standards are no different. They still represent baseline requirements for quality. What is different in Technology, however, is that the elements that make up those standards expand and grow beyond the parameters that the original formulators of standards could ever have imagined.

Even Bill Gates is alleged to have said in 1981 that “640K of memory is all that anybody would ever need on a computer.” He has since vehemently denied ever having said it, but the point stands: at the dawn of the personal computer age in 1981, nobody could have foreseen the need for megabytes of memory as standard on a computer, let alone gigabytes or even terabytes.

But as computer capabilities increase, so do the standards.

Government Standard

There was a time when the government standard for sharing information was really quite simple. Whether you called it “Loose lips sink ships” or just “Keep your mouth shut,” it was a low-tech standard for low-tech times.

With information sharing hurtling into cyberspace, however, it now takes more than someone’s silence to ensure that information is shared only with those who have a need to know. Recognizing this heightened need for standards that match today’s sharing capabilities, the Department of Homeland Security last month instituted a whole new set of standards for sharing classified information.

According to Nick Hoover in InformationWeek, “The directive names officials who will be responsible for the oversight of classified information sharing, and sets standards for security clearance, physical security, data security, classification management, security training, and contracting. These standards will apply both in government and in the private sector.”

The setting of these standards comes more than a full year after they were supposed to have been put in place (the original deadline was February 2011) and many months after the Pentagon reported the loss of 24,000 files from a Department of Defense contractor as the result of a cyber attack initiated by a foreign government. Nevertheless, they are in place and are a step in the right direction…but other standards still should be instituted before the government can truly say its IT systems are safe.

Standard Bearers

In addition to implementing a set of standards for how information is shared, the government needs to look at implementing a set of standards for the technology behind that information. Optimal software performance – from security to dependability to ease of use – depends upon application software living up to an appropriate, up-to-date set of standards across all facets of software health. Organizations – both public and private – need to ensure that the application software that makes up their IT systems is sound in each of the software health factors, including security, robustness, transferability, changeability and performance.

As we know from the CAST Report on Application Software Health (CRASH) released in December, government applications score the lowest of any industry when it comes to transferability – the ease with which software can be used by another agency, which in government should be a necessity – and about middle of the pack in the total quality of their applications. As the country’s largest employer, not to mention one of the world’s largest targets for cyber attacks, the Federal government needs to hold its application software to a higher standard.

Failing to optimize the overall health of its IT systems may continue to prove costly for the Feds. A set of standards for software health needs to be established, and the applications within the Fed’s IT systems assessed against those standards to identify where issues exist. Failing at least to identify where the problems lie means the Feds would remain vulnerable to attacks like the one the DoD admitted to last year.

When it comes to meeting standards for application software quality, “good enough for government work” should not and cannot be good enough.

 

Filed in: Software Quality
Jonathan Bloom Writer, Blogger & PR Consultant
Jonathan is an experienced writer with more than 20 years covering the Technology industry. Jon has written more than 750 journal and magazine articles, blogs and other materials that have been published throughout the U.S. and Canada. He has expertise in a wide range of subjects within the IT industry, including software development, enterprise software, mobile, database, security, BI, SaaS/Cloud, Health Care IT and Sustainable Technology. In his free time, Jon enjoys attending sporting events, cooking, studying American history and listening to Bruce Springsteen.