Rumsfeld on Software – Handling Unknown Unknowns


While former Secretary of Defense Donald Rumsfeld never spoke or wrote about software (as far as I know), his quip about unknown unknowns during the early months of the Iraq war is well known.

No matter what you think of Rumsfeld, his classification applies nicely to software and teaches us a lesson or two about building good software.

Rumsfeld's Classification Applied to Software

Some things you can test for right away (the known knowns). Some things you can anticipate and set aside to test for later (the known unknowns). But the unknown unknowns -- the top-right quadrant, shown in red -- are impossible to test for and not easy to plan for either. How an application and its environment will change is deeply uncertain.

How do you handle this uncertainty?

By starting with static analysis, but not stopping there. You have to go beyond static analysis in five ways:

  1. Analyze and measure the application as a whole, not just its component parts in isolation. This means going wide on technology coverage -- not just a plethora of languages, but also the frameworks and databases the application depends on. It also means putting your measurements in the context of the whole application, not just pieces of it.
  2. Generate a detailed architectural view that can be readily updated. This gives you the visibility to see what's changing.
  3. Make sophisticated checks of patterns and anti-patterns in software engineering to catch design and bad-fix problems that are otherwise impossible to find and eradicate.
  4. Provide actionable metrics that give IT teams a sense of what to change (and in what sequence) to improve quality.
  5. Automate, automate, automate! If you do 1 through 4 above, you are in effect automating design and code reviews -- consistently shown to be among the most effective insurance against unknown unknowns.
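To make point 3 concrete, here is a minimal sketch of an automated anti-pattern check. This is an illustration only, not any vendor's actual engine: it uses Python's standard `ast` module as a stand-in for a real multi-language analyzer, and flags two well-known Python anti-patterns (bare `except` clauses and mutable default arguments).

```python
import ast

# Hypothetical input: a snippet containing both anti-patterns.
SOURCE = """
def load(path, cache={}):
    try:
        return open(path).read()
    except:
        return None
"""

def find_antipatterns(source):
    """Return (line_number, message) pairs for two common anti-patterns."""
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # A bare `except:` swallows every error, including the unknown ones.
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((node.lineno, "bare except clause"))
        # A mutable default argument silently persists state across calls.
        if isinstance(node, ast.FunctionDef):
            for default in node.args.defaults:
                if isinstance(default, (ast.List, ast.Dict, ast.Set)):
                    findings.append((default.lineno, "mutable default argument"))
    return findings

for lineno, message in find_antipatterns(SOURCE):
    print(f"line {lineno}: {message}")
```

The point of automating such checks is exactly the one made above: a machine applies them to every line on every change, which is how design and code review stops depending on a human remembering to look.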