Government agencies are working hard to enact sweeping IT modernization and security efforts in tandem. The Department of Agriculture is blazing the trail on complying with the White House’s Executive Order on federal network security. And while their efforts don’t grab headlines quite as easily as Facebook’s latest privacy travails or the ongoing Russia investigation, they’re just as critical for protecting agencies’ network infrastructures from data breaches.
In the effort to modernize and secure their systems, agencies will encounter a slew of challenges. In particular, agencies must make more informed, risk-based IT procurement decisions, "which they currently don't have a consistent means of doing," according to Jeanette Manfra, an official at the Department of Homeland Security's Office of Cybersecurity and Communications. Manfra spoke on this topic at the CISQ Cyber Resilience Summit, held on March 20.
Manfra and several other speakers at the summit offered some timely insights on how government agencies can reinvent their software to be more secure than ever, without sacrificing innovation.
Get Ready for a New “Ride”
“Security and IT modernization go hand in hand,” says Manfra. This means that for Federal agencies, modernization projects are already imbued with a higher level of data security than the legacy systems they’re replacing. It’s similar to buying a new car to replace your turn-of-the-century junker: you’re all but guaranteed better fuel efficiency, a host of digitized engine functions and monitors, and conveniences like satellite radio and heated/cooled seats—features that just weren’t available the last time you were car shopping.
Likewise, agencies should realize that new software and networking products are a radical departure from their legacy ancestors: they're better and faster, come with a sexier user interface, and they're also safer by default, since their core components are already more secure.
Explore Quality Standards
Another great resource that agency CIOs can leverage is software quality standards: measurements of software health, including reliability, robustness, efficiency, changeability, transferability, and of course security. CISQ has been instrumental in creating these industry standards, which vendors can use as a baseline for their automated assessments. These standards are the first step in creating a process for identifying and correcting software quality errors that can impede software health, compromise vendors' output, and ultimately create security concerns.
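To make "automated assessment" concrete, here is a minimal, hypothetical sketch of the kind of check such tooling performs. It counts branch points per function as a crude proxy for changeability/maintainability. The function name, threshold, and metric are invented for illustration; real CISQ-conformant tools measure far more, far more rigorously.

```python
import ast

# Toy quality check in the spirit of CISQ-style automated assessment:
# count branching constructs per function as a rough complexity proxy.
# The threshold of 5 is an arbitrary illustration, not a standard value.
BRANCHES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

def complexity_report(source: str, threshold: int = 5) -> dict:
    """Return {function_name: branch_count} for functions over the threshold."""
    report = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, BRANCHES) for n in ast.walk(node))
            if branches > threshold:
                report[node.name] = branches
    return report
```

Running `complexity_report` over a codebase flags the functions most likely to resist safe change, which is exactly the sort of signal a CIO can fold into a risk-based procurement or remediation decision.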
Embrace the Three Ps
So far, I’ve cited inherent technology advances and the emergence of solid software health standards and measurement techniques as developments contributing to agencies’ secure migrations from legacy systems. But there’s more, as neatly summed up by Sanjeev “Sonny” Bhagowalia, a Senior Advisor on Technology and Cybersecurity in the U.S. Department of the Treasury, who also spoke at the Summit.
Bhagowalia cites "people, policy, process, and technology" as essentials to improved data security in any venue, but particularly in government agency systems. It's not enough just to modernize systems in a security context, according to Bhagowalia. For example, you can modernize your agency's means of managing personnel information. But if your employees are in the habit of leaving their laptops or tablets out in the courtyard, then even the most secure software in the world may not be secure at all once it falls into the wrong hands. (And yes, those who failed to enact policies in the first place are even more at fault.)
Focus on the Core
Last but certainly not least among tactics for better Federal IT security is for agencies to keep their focus on the most vulnerable parts of the tech operation. For example, most security spending goes to securing the perimeter of systems by modernizing the apps that users touch and feel every day. However, the most devastating security attacks occur at the core of an application, buried underneath the user-interface layer, in either the business or data layer.
Neglecting these core layers is like obsessing over gaining five pounds while ignoring severe abdominal pain: you plan a carrots-only diet for the next month or two while colon cancer goes undetected. The lesson here: the components that users interact with only indirectly often hold the riskiest flaws.
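A classic example of a data-layer flaw hiding beneath a perfectly polished UI is SQL injection. The sketch below is a minimal, hypothetical illustration (table, names, and inputs are invented): the unsafe query assembles SQL by string concatenation, so attacker-controlled input rewrites the query's logic, while the parameterized version treats input strictly as data.

```python
import sqlite3

# Toy in-memory database standing in for an application's data layer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

def lookup_unsafe(user_id):
    # Flaw: the query is built by concatenation, so input like
    # "1 OR 1=1" changes the WHERE clause itself (SQL injection).
    return conn.execute("SELECT name FROM users WHERE id = " + user_id).fetchall()

def lookup_safe(user_id):
    # Fix: a parameterized query; the driver never interprets the
    # input as SQL, only as a value to compare against.
    return conn.execute("SELECT name FROM users WHERE id = ?", (user_id,)).fetchall()

print(lookup_unsafe("1 OR 1=1"))  # → [('alice',), ('bob',)] — every row leaks
print(lookup_safe("1 OR 1=1"))    # → [] — the malicious input matches nothing
```

No amount of polish at the UI layer would surface this bug; it lives in code users never see, which is precisely why the core deserves a share of the security budget.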
As government agencies work hard to retire their legacy systems and address significant security concerns, they can mitigate risks in several ways, including the use of software health assessments and project management tools. By doing so, they're all but ensuring that they stay out of the headlines—at least for negative reasons.
Erik Oltmans, an Associate Partner from EY, Netherlands, spoke at the Software Intelligence Forum on how the consulting behemoth uses Software Intelligence in its Transaction Advisory services.
Erik describes the changing landscape of M&A. Besides the financial and commercial aspects, PE firms now place equal value on technical assessments, especially for targets with significant software assets. He goes on to detail how CAST Highlight makes these assessments possible with limited access to the target's systems, customized quality metrics, and insight into the liability implications of open source components, all three of which are critical for M&A due diligence.