Do You Feel Lucky? Well, Do Ya?


Clint Eastwood said it and Pete Peterson's life story is apparently filled with it, but can luck be quantified?

Yes, according to two business professors, Robert Connolly of the University of North Carolina and Richard Rendleman of Dartmouth. Their work is the subject of a recent Wall Street Journal article.

By using cubic spline functions on data from every PGA tournament from 1998 to 2001, they've been able to separate out the roles played by skill and by luck in golf.

Based on their technique, they can tell you the score that any golfer's intrinsic skill alone will produce. Scoring below that is due to good luck; above that, and you're having bad luck. The amount of luck is measured directly in the number of strokes over or under your skill-based score.
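The core idea can be sketched in a few lines. This is a hypothetical illustration, not the professors' actual model (which also accounts for factors like course difficulty): fit a smooth curve through a golfer's scores to estimate the skill component, and treat the per-round residuals as luck. The player, scores, and smoothing parameter here are all made up for the example.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)

# Simulated data (assumed): 40 rounds from a slowly improving golfer.
rounds = np.arange(40)
true_skill = 71 - 0.03 * rounds
scores = true_skill + rng.normal(0, 2, rounds.size)  # observed = skill + luck

# A cubic smoothing spline estimates the skill curve; s controls
# how closely the curve is allowed to chase individual rounds.
skill_curve = UnivariateSpline(rounds, scores, k=3, s=rounds.size * 4)
estimated_skill = skill_curve(rounds)

# Residuals are the "luck" measured in strokes:
# negative = fewer strokes than skill predicts = good luck.
luck = scores - estimated_skill

print(f"mean |luck| per round: {np.abs(luck).mean():.2f} strokes")
```

The spline's smoothness is doing the real work: too stiff and genuine skill changes get counted as luck, too flexible and luck gets absorbed into "skill."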

That's fascinating! It got me thinking about other data sets this type of analysis might be applied to. We all "play" in arenas where we don't control everything -- often we don't even know the myriad relevant variables. This technique might be a way to separate what we do control from the exogenous factors that influence outcomes.

Some areas where this technique might come in handy:

* Strategic business decisions
* Stock picking or portfolio balancing
* The number of gold medals a country ends up with in the next Olympics
* The capacity of a company's IT network
* The number of business disruptions a mission-critical IT system causes

and many, many more. In general, these are all complex systems with a multitude of interconnected variables, of which any single player can influence only a subset.

Once we separate the skill (or controllable) part from the luck part, what do we do? Well, at least now we can make a more informed decision about how much slack to build into the business case or contract, or how much insurance to buy.
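One way that decision could be made concrete, as a rough sketch: once past outcomes are split into a controllable baseline plus a luck residual, size the slack from the residual distribution. Everything here is assumed for illustration: the baseline figure, the units, and the residuals themselves, which would in practice come from an analysis like the one above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed inputs: a baseline cost from the controllable factors,
# and a history of luck residuals (deviations from that baseline).
baseline_cost = 100.0
luck_residuals = rng.normal(0, 8, 500)

# Size the slack as the 95th percentile of the residuals: a quote
# that bad luck overruns only about 5% of the time.
buffer = np.percentile(luck_residuals, 95)
quoted_price = baseline_cost + buffer

print(f"baseline {baseline_cost:.0f}, buffer {buffer:.1f}, quote {quoted_price:.1f}")
```

The same percentile logic applies whether the "slack" is contract contingency, network headroom, or an insurance deductible; only the residual distribution changes.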

Luck may favor the well prepared, but even when it doesn't, you'll now be prepared.

Filed in: Technical Debt
Tagged: golf luck quality