Fed Should Budget for Technical Debt

by Jonathan Bloom

It’s a presidential election year in the U.S. That means lots of attention paid to candidates saying what they think we want to hear in order to win office. It also means the standard operations of government tend to fade into the background.

Take the Federal budget debate. In most years it would be front-page material, particularly in a year when Congress has vowed to make significant cuts in order to reduce the deficit. With election news grabbing the spotlight every night, though, the preliminary discussions have generated very little coverage.

One item that has come up, however, is a proposal to cut a portion of the Federal government’s IT budget. As Nick Hoover reported in InformationWeek recently, President Obama’s preliminary fiscal budget calls for a relatively slight trim of the Federal IT budget – only 1.2%. In real dollars, however, that translates to $900 million in cuts. Two-thirds of that sum would come from the IT budget of the Department of Defense – just one year after the DoD disclosed the largest cyber theft of sensitive material in its history.

In laying out the reasoning for the cuts, Federal CIO Steven VanRoekel said that as much as $300 million of the savings will come from data center consolidation. He reasons that by centralizing where information is stored, the government will require less hardware, less space and fewer personnel to house the data on which it runs. But that raises a question: where will the remaining $600 million come from?
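A quick back-of-the-envelope check, using only the figures cited above, shows the scale involved (a sketch; the implied total budget is an approximation derived from the 1.2% figure, not an official number):

```python
# Back-of-the-envelope check of the budget figures cited in this article.
total_cut = 900e6        # $900 million in proposed IT cuts
cut_fraction = 0.012     # the 1.2% trim cited in the preliminary budget
consolidation = 300e6    # savings attributed to data center consolidation

implied_total_budget = total_cut / cut_fraction   # ~$75 billion
remaining_gap = total_cut - consolidation         # $600 million unaccounted for

print(f"Implied total Federal IT budget: ${implied_total_budget / 1e9:.0f}B")
print(f"Cuts still to be found elsewhere: ${remaining_gap / 1e6:.0f}M")
```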

An Inconvenient Truth

Obviously, the cuts to the Federal IT budget will need to go deeper than data center consolidation alone can deliver. Unfortunately, as with any business, cutting funds from budgets inevitably results in reductions in quality.

As last year made clear, the U.S. government has become a persistent target for cyber terrorism. The theft of 24,000 sensitive files from a defense contractor, the July 4th cyber attack on Department of Energy contractor Pacific Northwest National Laboratory and the computer virus that infected U.S. Air Force drone systems all illustrate the dire need to bolster the quality of application software in every facet of government.

By the admission of outgoing Federal CIO Vivek Kundra, the government already has trouble following through on IT projects, which can have an adverse effect on quality. Cutting Federal IT funding could further exacerbate this issue. For these cuts to come at a time when quality assurance is so vital could leave the government even more susceptible to a cyber attack.

Making Quality Self-Evident

In a rather coincidental twist, the answer to how the government can cut the Federal IT budget further while also addressing security and other application software quality issues may lie in a single effort – cutting, or at least controlling, technical debt.

As we’ve discussed in many posts here, technical debt represents the cost of repairing issues in application software that surface after deployment – issues that, in many cases, could have been detected beforehand. By bringing technical debt under control, funds that would otherwise be spent fixing problems could instead be directed toward more innovative protection against cyber terrorism.
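To make that concrete, technical debt is often estimated by pricing out the known structural violations in a codebase. The sketch below illustrates the idea only; the violation counts, fix times and hourly rate are hypothetical example values, not figures from any Federal system:

```python
# Hypothetical illustration of estimating technical debt:
# price out known structural violations by severity.
# All counts, fix times, and rates below are made-up example values.

violations = {           # severity: (violation count, avg fix hours)
    "critical": (120, 8.0),
    "high":     (450, 4.0),
    "medium":   (2300, 1.5),
}
hourly_rate = 75.0       # assumed blended developer cost, $/hour

debt = sum(count * hours * hourly_rate
           for count, hours in violations.values())
print(f"Estimated technical debt: ${debt:,.0f}")
```

Every dollar in that estimate is a dollar of maintenance spending that better structural quality could have avoided.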

Not only would bringing technical debt under control trim the Federal IT budget, it would also mean that the money the government does spend on IT is used more efficiently.

But how can government achieve such efficiencies and address technical debt?

The government needs to pay greater attention to the structural quality of its application software, and it must do so in a highly efficient manner. That means every entity that works on Federal IT systems – in-house and outsourced alike – should employ some form of automated structural analysis to detect issues before they turn into breach points and outages. A minimal sketch of what such a check can look like appears below.
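As an illustration of the principle – a toy check built on Python’s standard ast module, not a stand-in for a full commercial analyzer – the snippet below scans source files for calls to eval() and exec(), a common injection risk, before the code ever ships:

```python
# Toy structural check: flag risky eval()/exec() calls in Python source
# before deployment. Real structural analysis covers far more rules;
# this only illustrates the "detect issues pre-deployment" idea.
import ast
import sys

RISKY_CALLS = {"eval", "exec"}

def find_risky_calls(source: str, filename: str):
    """Yield (line_number, call_name) for each risky call in the source."""
    tree = ast.parse(source, filename=filename)
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id in RISKY_CALLS):
            yield node.lineno, node.func.id

if __name__ == "__main__":
    for path in sys.argv[1:]:
        with open(path) as f:
            for line, name in find_risky_calls(f.read(), path):
                print(f"{path}:{line}: risky call to {name}()")
```

Run automatically on every check-in, even a simple gate like this catches problems while they are still cheap to fix – which is precisely the economics of technical debt at government scale.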

Jonathan Bloom, Technology Writer & Consultant
Jonathan Bloom has been a technology writer and consultant for over 20 years. During his career, Jon has written thousands of journal and magazine articles, blogs and other materials addressing various topics within the IT sector, including software development, enterprise software, mobile, database, security, BI, SaaS/cloud, Health Care IT and Sustainable Technology.