By definition, standards are a set of bare minimum requirements for meeting levels of acceptability. In school, the students who took the "standard" level courses were those performing "at grade level" and focused simply on graduating. Every April in the United States we must decide whether to take the "standard deduction" – the bare minimum we can claim for our life's expenses – or whether we have enough to itemize our living expenses and therefore deduct more from our base income before taxes.
In other words, standards are the vanilla ice cream of business requirements.
When it comes to technology, standards are no different. They still represent baseline requirements for quality. What is different in technology, however, is that the elements that make up those standards expand and grow beyond parameters that the original formulators of the standards could ever have imagined.
Even Bill Gates is alleged to have said in 1981 that "640K of memory is all anybody would ever need on a computer." He has since vehemently denied ever having said it, but the point stands: at the dawn of the personal computer age in 1981, nobody could have foreseen the need for megabytes of memory as standard on a computer, let alone gigabytes or even terabytes.
But as computer capabilities increase, so do the standards.
There was a time when the government standard for sharing information was really quite simple. Whether you called it "Loose lips sink ships" or just "Keep your mouth shut," it was a low-tech standard for low-tech times.
With information sharing hurtling into cyberspace, however, it now takes more than someone's silence to ensure that information is shared only with those who need to know. Recognizing this heightened need for standards that keep pace with today's sharing capabilities, the Department of Homeland Security last month instituted a whole new set of standards for sharing classified information.
According to Nick Hoover in InformationWeek, “The directive names officials who will be responsible for the oversight of classified information sharing, and sets standards for security clearance, physical security, data security, classification management, security training, and contracting. These standards will apply both in government and in the private sector.”
The setting of these standards comes more than a full year after they were supposed to have been put in place (the original deadline was February 2011) and many months after the Pentagon reported the loss of 24,000 files at a Department of Defense contractor as the result of a cyber attack initiated by a foreign government. Nevertheless, they are in place and are a step in the right direction…but there are other standards that still should be instituted before the government can truly say its IT systems are safe.
In addition to implementing standards for how information is shared, the government needs to implement standards for the technology behind that information. Optimal software performance – from security to dependability to ease of use – depends upon application software meeting an appropriate, up-to-date set of standards across all facets of software health. Organizations – both public and private – need to ensure that the application software that comprises their IT systems is sound in each of the software health factors, including security, robustness, transferability, changeability, and performance.
As we know from the CAST Report on Application Software Health (CRASH) released in December, government applications score the lowest of any industry when it comes to transferability – the ease with which software can be used by another agency, which in government should be a necessity – and about middle-of-the-pack in total application quality. As the country's largest employer, not to mention one of the world's largest targets for cyber attacks, the Federal government needs to hold its application software to higher standards.
Failing to optimize the overall health of its IT systems may continue to prove costly for the Feds. A set of standards for software health needs to be established, and the applications within the Fed's IT systems assessed against those standards to identify where issues exist. Failing to at least identify where the problems lie means the Fed would remain vulnerable to attacks like the one the DoD admitted to last year.
When it comes to meeting standards for application software quality, “good enough for government work” should not and cannot be good enough.
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik described the changing landscape of M&A. Beyond the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He went on to detail how CAST Highlight makes these assessments possible with limited access to the target's systems, customized quality metrics, and visibility into the liability implications of open source components – all three of which are critical for M&A due diligence.