Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock and Dan Gardner (Crown)
According to the old joke, the average expert is roughly as accurate as a dart-throwing chimpanzee. The joke is based on research by Philip Tetlock, the scientist whose prior work revealed that the predictions of academics, pundits and other experts about the economy, stocks, elections, wars and other issues were no better than random guessing. Ruminating on that data, Tetlock became intrigued by one discrepancy in his work: while many experts had done little better than guessing, some had done very well indeed. Why, he wondered, was that?
In his new book, Superforecasting: The Art and Science of Prediction, Tetlock (along with co-author and journalist Dan Gardner) shows how he cracked the code using a multi-year forecasting study that asked thousands of volunteers—engineers, artists, lawyers, scientists and a random assortment of others—to forecast world events. What the study found was that a subset of these individuals were remarkably reliable in their accuracy: so accurate that a team that included a filmmaker, a retired bird-watcher and a mathematician consistently beat a handpicked team of crack national security analysts in a series of forecasting competitions. Tetlock calls such individuals “superforecasters,” and his book describes what makes them so good and how others can learn to do what they do.
Forecast, measure, revise, repeat
Turn on cable news and chances are you’ll catch a pundit or pollster painting scenarios for how a given issue will play out. The same is true in business. Executive prognosticators of various stripes are paid substantial fees to speak at marquee events and counsel boards. And there’s a reason for that. We love experts and forecasts, but as Tetlock notes, we seldom do the Monday morning quarterbacking to see just how good their predictions are.
Superforecasters, by contrast, are notable for their zealous embrace of continual self-improvement. Test, measure, iterate and retest. That discipline underlies the best predictions. Tetlock and Gardner explain, “…Superforecasting demands thinking that is open-minded, careful, curious, and—above all—self-critical… Only the determined can deliver it reasonably consistently, which is why our analyses have consistently found commitment to self-improvement to be the strongest predictor of performance.”
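The test-measure-iterate loop only works if forecasts are scored against outcomes. The forecasting tournaments Tetlock describes use the Brier score, the mean squared difference between stated probabilities and what actually happened (lower is better; 0 is perfect). A minimal sketch, with invented numbers for illustration:

```python
# Brier score: mean squared error between probability forecasts (0-1)
# and binary outcomes (0 or 1). A score of 0 is perfect foresight;
# always guessing 50% yields 0.25.

def brier_score(forecasts, outcomes):
    """Average of (probability - outcome)^2 across all questions."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical forecasters scored on the same four yes/no questions:
overconfident = [0.9, 0.9, 0.9, 0.9]   # always very sure
calibrated = [0.8, 0.7, 0.9, 0.6]      # probabilities matched to evidence
outcomes = [1, 0, 1, 1]                # what actually happened

print(round(brier_score(overconfident, outcomes), 3))  # 0.21
print(round(brier_score(calibrated, outcomes), 3))     # 0.175
```

Tracking a score like this over many questions is what lets a forecaster measure, revise and retest rather than rely on vague self-assessment.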
Fortunately, the qualities that give superforecasters their edge can be learned. And one doesn’t need to be a math major or economist to see markedly improved results. Tetlock distills these qualities into “Ten Commandments.” Among them are common-sense precepts, such as “focus on questions where your hard work is likely to pay off.” He also advises breaking problems into sub-parts, separating the knowable from the unknowable, calibrating your assumptions and making probability your friend. “The surprise,” says Tetlock, “is how often remarkably good probability estimates arise from a remarkably crude series of assumptions and guesstimates.”
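The “break problems into sub-parts” precept can be sketched in a few lines. This hypothetical example (the question and all the numbers are invented, not from the book) decomposes a single hard question into smaller conditional guesstimates and multiplies them:

```python
# Decomposing "Will candidate X win the election?" into sub-questions,
# each answered with a crude guesstimate. All figures are invented
# for illustration only.

p_runs = 0.7          # rough guess: the candidate enters the race
p_wins_primary = 0.4  # ...given that they run
p_wins_general = 0.5  # ...given that they win the primary

# Chain rule: the overall probability is the product of the pieces.
p_wins_election = p_runs * p_wins_primary * p_wins_general
print(round(p_wins_election, 3))  # 0.14
```

The point of the exercise is not the arithmetic but the discipline: each sub-estimate is explicit, so it can be challenged, calibrated and revised as new information arrives.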
Superforecasters are also keenly attuned to nuance. Every set of assumptions comes with a degree of uncertainty. Superforecasters tease out those areas of uncertainty and assign them weights based on their relative probability. That ability to think more granularly about such unknowns takes patience and application, but ultimately, Tetlock argues, everyone can improve his or her predictive abilities by maintaining an open mind and digging into one’s analyses consistently and meticulously, sleeves rolled up, with a view to continually measuring and refining one’s approach.
Major takeaways by role:
Insist on evidence-based forecasting. Have managers set specific performance metrics for departmental and business line budgeting and forecasting to validate assumptions and minimize subjective guesswork.
Fill your team with broad thinkers. Look for individuals with strong research skills who can create and weigh scenarios objectively, and who have the mental discipline and humility to continually scrutinize their analyses and hold them up for examination.
Quality forecasts underpin the finance and IT functions. Create algorithms that free IT and finance teams to focus on complex, subjective analyses requiring sophisticated judgment and reasoning.
Integrate data science teams into the marketing function to raise the level of analytics talent and improve campaign outcomes. Make “try, fail, analyze, adjust and try again” a mantra.