False Positives in Security – Why We Like to Cry Wolf

by Lev Lesokhin

On a recent morning, I was settling into a meeting with a Gartner application security analyst in our conference room. Suddenly the fire alarm went off, followed by the familiar announcement from the building fire marshal: “This is only a test. Please ignore the fire alarm.” I turned to our analyst guest and said, “Funny, that’s exactly what I was planning to do.” This turned out to be a perfect segue into our ensuing discussion about our approach to avoiding a false positive deluge in application security findings.

False Positives in Security – The Unintended Consequences

This reminded me of the CISQ Cyber Resilience Summit I attended a few weeks back. Tony Scott, the Federal CIO under President Obama, wrapped up that meeting with some insightful remarks. One of the points he conveyed, speaking right after a panel of regulators, was a story about false positives in his home state of California. Apparently, buildings that contain asbestos in their construction materials have to display a plaque with a warning to that effect. Tony then proceeded to tell us that almost every building in the Golden State carries this plaque. Short of living in the tent cities set up in some parts of the state, residents have no choice but to ignore these false positives.


Now, our office happens to be in Manhattan, where we are either fortunate or blissfully unaware of our asbestos exposure. So no asbestos warnings, but we have a different problem. With so many multi-story buildings in New York City, fire safety is a real concern here. I’ve worked in Manhattan for a long time, and for as long as I can remember, local ordinances have required building management to run regular fire drills and fire alarm tests. Between those and plain false alarms, the fire alarm in our building goes off once or twice a month on average. Take a wild guess at the effect that has on the building’s tenants. You guessed it. We stop noticing. It’s another false positive. We don’t care. We have become conditioned to ignore the alarm.

In my humble opinion, this is dangerous. It means we have no effective fire alarm in our office building. No early warning! For those of us who’ve worked there for a while it’s even worse, because it means we effectively won’t have a fire alarm in any building for the foreseeable future – until this Pavlovian response fades from memory.

The situations in NYC and California are both examples of a well-intentioned process that fails miserably in practice, with dangerous side effects. We see the same thing with many of our clients when it comes to application security. The security team and the enterprise compliance policies force as many software scans as possible, with no regard for accuracy. With the advent of DevSecOps, we’re automating these scans into our toolchains and IDEs. That way we can rain findings upon our developer colleagues every day, every hour, every time they type a line of code. We work furiously to perform these analyses more quickly – in minutes or less – so that findings can be delivered to the developer ASAP.
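To make that concrete, here is a minimal sketch of the “scan every commit, block on every finding” gate described above. The scanner CLI name, its flags, and the JSON output shape are hypothetical stand-ins for whatever tool a pipeline happens to wire in; the point is the policy, which never asks whether a finding is accurate.

```python
import json
import subprocess
import sys

def gate_commit() -> int:
    """Run a (hypothetical) fast scanner and fail the build on any finding."""
    try:
        result = subprocess.run(
            ["fast-sast-scan", "--format", "json", "."],  # hypothetical CLI
            capture_output=True, text=True, check=False,
        )
        findings = json.loads(result.stdout or "[]")
    except FileNotFoundError:
        print("scanner not installed; letting the commit through")
        return 0

    # The policy in a nutshell: every finding blocks the commit, accurate or not.
    for finding in findings:
        print(f"[{finding.get('severity', '?')}] {finding.get('message', '')}")
    return 1 if findings else 0

if __name__ == "__main__":
    sys.exit(gate_commit())
```

Wire that into a pre-commit hook or a CI stage and you have the deluge: findings on every keystroke, with accuracy nobody’s metric.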

[Suggested Reading: False positives in SAM – Achilles’ heel or Samson’s hair?]


False Positives in Security – How to Reduce Them

Well, guess what. The only way to get an analysis that’s accurate is to assess the whole system, to resolve all the complex dependencies, to map out the architecture. That kind of system-level analysis can be automated using advanced software intelligence tech, like CAST. But it might run for an hour, rather than a minute. And because it takes longer than a coffee refill in the office kitchen, it’s often deemed unacceptable. Quick and dirty is better. More “good enough” findings, rather than fewer-yet-accurate findings.
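Why does the fast scan pay for its speed in accuracy? Here is a toy illustration (my own sketch, not CAST’s engine): a file-local scanner sees user input flowing into a SQL string and has to assume the worst, while a system-level analysis resolves the call into the neighboring module, sees the sanitizer, and discharges the finding. Both “modules” are collapsed into one runnable file below.

```python
# --- validators.py (a module a quick, file-local scan never resolves) ----
def to_account_id(raw: str) -> int:
    """Reject anything that isn't a plain integer, removing the taint."""
    if not raw.isdigit():
        raise ValueError(f"invalid account id: {raw!r}")
    return int(raw)

# --- handler.py -----------------------------------------------------------
def fetch_account_query(raw_id: str) -> str:
    account_id = to_account_id(raw_id)  # sanitized in the *other* module
    # A file-local heuristic flags this line as possible SQL injection,
    # because it cannot see inside to_account_id(). A whole-system analysis
    # resolves the call, sees the isdigit() guard, and reports nothing here:
    # fewer, yet accurate, findings.
    return f"SELECT * FROM accounts WHERE id = {account_id}"

if __name__ == "__main__":
    print(fetch_account_query("42"))      # fine: digits only
    try:
        fetch_account_query("1 OR 1=1")   # the attack the guard already blocks
    except ValueError as err:
        print(f"rejected: {err}")
```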

You reap what you sow. Just like with our office fire alarm, developers become conditioned to expect that over 50% of these findings are wrong. False alarms. Another meaningless false positive. So the findings get ignored. The well-intentioned app security policy becomes a policy of ignoring app security. And this, too, can be dangerous.
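The arithmetic behind that conditioning is worth spelling out. With illustrative numbers (assumed here, not measured from any particular tool), a 50% false-positive rate means half of every triage hour is wasted – and the waste compounds with every automated scan:

```python
# Back-of-the-envelope model of alert fatigue. All numbers are assumptions
# for illustration, not measurements from any specific scanner.
findings_per_scan = 200      # hypothetical scanner output
false_positive_rate = 0.5    # "over 50% of these findings are wrong"
minutes_to_triage = 10       # hypothetical cost to investigate one finding

real_flaws = findings_per_scan * (1 - false_positive_rate)
wasted_hours = findings_per_scan * false_positive_rate * minutes_to_triage / 60

print(f"{real_flaws:.0f} real flaws buried in {findings_per_scan} findings")
print(f"{wasted_hours:.0f} developer-hours per scan spent chasing ghosts")
# -> 100 real flaws buried in 200 findings
# -> 17 developer-hours per scan spent chasing ghosts
```

Run that daily across a dozen teams and the rational response is exactly the one we see: stop reading the report.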

[Suggested Reading: Juliet and OWASP Benchmark Results: How CAST Tests Against 2 Most Important Application Security Standards in 2019]


Why is it that safety and security authorities want to cry wolf, whether it’s health safety, fire safety, or app sec safety? I believe it’s a natural reflex that we have to fight against. With an overabundance of findings, nobody can point a finger at the security officer and blame them for missing something when that inevitable incident occurs. They did their job if they reported a potential breach, even if it was buried in a mountain of false positives. They get to be on the right side of a breach, or a fire. They did as much as they could to help us protect ourselves. How can we blame the AppSec team when the silly developer didn’t fix a flaw somewhere in a list of hundreds of false positives? Nobody will blame the fire department for conditioning us to ignore the fire alarm. It’s too far-fetched.

[Additional Reading: Reduce False Positives in Application Security Testing]


I hope that we will come to our senses about true safety measures. I hope we commit to reducing false positives in security, replacing quantity of alarms with quality of findings. In the cyber world, as in the physical safety world, the stakes are just too high for us to continue the same way.

Interested in learning more about reducing false positives in security? Connect with our security experts to get a demo of the static application security testing platform Forrester rated most accurate.

Lev Lesokhin, EVP, Strategy and Analytics at CAST
Lev spends his time investigating and communicating ways that software analysis and measurement can improve the lives of apps dev professionals. He is always ready to listen to customer feedback and to hear from IT practitioners about their software development and management challenges. Lev helps set market & product strategy for CAST and occasionally writes about his perspective on business technology in this blog and other media.