Phil Tetlock and Dan Gardner on “Better Learning Through Better Betting”

“Ideally, a bet would use a question as big as the debate it means to settle. But that will not work, because big questions – ‘Will population growth outstrip resources and threaten civilization?’ – do not produce easily measurable outcomes. The key, instead, is to ask many small, precise questions…. This approach, using question clusters, could be applied to virtually any important debate. Right now, for example, we are putting the hawks-versus-doves argument about the Iran nuclear deal to the forecasting test.

“Naturally, using many questions could result in split decisions. But if our goal is to learn, that is a feature, not a bug. A split decision would suggest that neither bettor’s understanding of reality is perfectly accurate and that the truth lies somewhere between. That would be an enlightening result, particularly when public debates are dominated by extreme positions.”

Phil Tetlock and Dan Gardner, Project Syndicate, May 11, 2016

Join the “Electric Vehicle Tipping Point” Challenge at GJ Open

Enthusiasm for electric vehicles has historically followed a boom-and-bust cycle. With the emergence of electric vehicle startups like Tesla, the introduction of global electric models like the Nissan Leaf, declining battery costs, government subsidies, and public/private infrastructure investments, it is a good time to ask: Are we on the cusp of an electric vehicle “tipping point”? Continue reading

Paul Schoemaker and Phil Tetlock in the Harvard Business Review on “Superforecasting: How to Upgrade Your Company’s Judgment”

“The sweet spot that companies should focus on is forecasts for which some data, logic, and analysis can be used but seasoned judgment and careful questioning also play key roles. … On the basis of our research and consulting experience, we have identified a set of practices that leaders can apply to improve their firms’ judgment in this middle ground. Our recommendations focus on improving individuals’ forecasting ability through training; using teams to boost accuracy; and tracking prediction performance and providing rapid feedback. The general approaches we describe should of course be tailored to each organization and evolve as the firm learns what works in which circumstances.”

Paul J. H. Schoemaker and Philip E. Tetlock, Harvard Business Review, May 2016

The Wharton Journal on “A book review and a love letter”

“Written by Penn’s own genius prognosticator Phil Tetlock, the work is chock full of lessons on prediction…. Of Tetlock’s thousands of novices, 2% stuck out as highly and consistently accurate. He deems these people ‘superforecasters’, and attempts to glean lessons from them. What he learns and then recommends seems intuitive – break down an issue into discrete parts, think probabilistically, minimize internal biases and utilize data and logic, keep a record of your predictions and review for lessons learned, etc.”

Matt McGuire, The Wharton Journal, April 13, 2016

“Journal of Strategic Security” Reviews Superforecasting

“Superforecasting … is very much worth reading by intelligence professionals, or by anyone in any field who is interested in understanding how to look into the future. It should be required reading for all students interested in intelligence, and for professors who teach intelligence – they should update their courses to reflect these new insights.”

Edward M. Roche, Journal of Strategic Security 9:1 (2016)

Barry Ritholtz’s Masters in Business: Philip Tetlock Interview

“As it turns out, there are ways to improve your ability to make probabilistic guesses about the future. The author discovered that ‘Teams of ordinary forecasters beat the wisdom of the crowd by about 10%; prediction markets beat ordinary teams by about 20%; and superteams beat prediction markets by 15% to 30%.’ This leads to numerous insights into ways to improve your forecasting skills. Tetlock notes that ‘the more degrees of uncertainty you can distinguish, the better a forecaster you are likely to be.’”

Barry Ritholtz, Masters in Business, March 26, 2016

Penn Spotlights on “Penn-trained ‘Superforecasters’ Outpredict Pundits in 2016 Elections”

“According to Professor Philip Tetlock, the quality of the 2016 presidential debates has been disappointing to say the least, but could be improved by applying the methods used in forecasting tournaments—where predictions are scored against actual outcomes…. ‘The forecasting tournament’s scientific approach and probability estimation would be a more civilized way to figure out whose policies are going to lead to which consequences.’”

Christina Cook, Penn Spotlights, March 22, 2016
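
The scoring Tetlock refers to is concrete: each probability forecast is compared with what actually happened. Good Judgment’s tournaments used the Brier score for this. The sketch below shows the basic calculation; the forecasts and outcomes are hypothetical illustrations, not tournament data.

```python
def brier_score(forecast: float, outcome: int) -> float:
    """Squared error between a probability forecast and a 0/1 outcome.

    0.0 is a perfect score; a perpetual 50/50 guesser scores 0.25.
    (The two-category Brier score used in Tetlock's tournaments is
    simply twice this value for binary questions.)
    """
    return (forecast - outcome) ** 2

# Hypothetical question cluster: (forecast probability, what happened).
predictions = [
    (0.80, 1),  # forecast 80% "yes"; the event occurred
    (0.30, 0),  # forecast 30% "yes"; the event did not occur
    (0.60, 0),  # forecast 60% "yes"; the event did not occur
]

mean_score = sum(brier_score(p, o) for p, o in predictions) / len(predictions)
print(f"Mean Brier score: {mean_score:.3f}")  # ~0.163; lower is better
```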

REIT Magazine on “Phil Tetlock Explains the Art and Science of Superforecasting”

“The superforecasters are professionals at making probability estimates in an uncertain world. As a firm, that’s Good Judgment’s specialty. Running a successful portfolio requires many other specialized skills, so at this point we are partnering with firms who have those skills and recognize the value that Good Judgment offers to their investment processes.”

Allen Kenney, REIT Magazine, March 21, 2016

2016 Axiom Business Book Awards: Superforecasting Wins Gold for Business Theory

“The Axiom Business Book Awards are intended to bring increased recognition to exemplary business books and their creators, with the understanding that business people are an information-hungry segment of the population, eager to learn about great new books that will inspire them and help them improve their careers and businesses.”

Axiom, February 17, 2016

Parsing Techniques: Layered Realm Analysis

by Ari, Superforecaster

Superforecasters sometimes sift through terabytes of data and gobble gigabytes of printed material in the process of formulating accurate probability estimates. When we discover experts offering timely, authoritative advice to the public on a topic relevant to our forecasts, we pay special attention. Sometimes we parse every word and every number they offer in support of their statements, arguments, and policy positions in order to determine how heavily to weight them in our forecast analyses.

Recently, Dr. Larry Summers offered his prognosis of the current global economic recovery and then prescribed a remedy that he extracted from an economic theory developed in the 1930s. What follows is an example of an experimental parsing technique, which I’ve dubbed Layered Realm Analysis (LRA), applied to Dr. Summers’ essay. Continue reading

NPR 13.7: Cosmos and Culture on “Want To Make Better Predictions?”

“Many of the characteristics that set apart superforecasters are unsurprising. For instance, they tend to have high levels of fluid intelligence (the ability to engage in abstract reasoning and problem solving), and to be well-informed about the relevant domain: world politics. They also put in the hours to learn and think carefully about each prediction, drawing on multiple sources of information. But the superforecasters additionally distinguished themselves in more subtle ways.”

Tania Lombrozo, NPR 13.7: Cosmos and Culture, February 23, 2016

On the Media on “The Psychology of Predictions”

“The 2016 election season is awash in bad political prognostications…just like every other election season. But if pundits are always so wrong, why do we keep listening? And more importantly, why do the media keep airing them? Philip Tetlock is a professor of psychology and management at the University of Pennsylvania, and has been studying political prognosticators for decades, first in his book Expert Political Judgment, and recently in Superforecasting. (He also runs the Good Judgment Project.) Tetlock talks with Bob about good forecasters and bad forecasters and why the media encourage poor punditry.”

Bob Garfield, On the Media, March 4, 2016

The Telegraph on “Beware: the experts are usually poor forecasters”

“[I]n his latest book, Superforecasting: The Art and Science of Prediction, co-authored with Dan Gardner, Tetlock recounts how he has discovered a small category of superior analysts who stand out from the mediocre average… Tetlock’s book is of fundamental importance and should be read by everybody in economics, business, finance and politics who wants to improve their ability to understand and predict trends and developments.”

Allister Heath, The Telegraph, March 3, 2016

Forecasting Case Study: Ghana’s Oil Revenues

by Samuel Bekoe and David Mihalyi

To help Ghanaian MPs and the general public understand the potential impact of volatile petroleum prices on the implementation of the 2015 budget, we built an oil revenue forecasting model during the budget debates in 2014. We found that several techniques familiar to superforecasters improved our forecasting process, including regular belief updating, finding the right base rate, and conducting a rigorous post-mortem to assess lessons learned. We offer our experience as a case study in forecasting to help bring public scrutiny to budget forecasting in developing countries. A minimal sketch of the base-rate and belief-updating steps follows the link below.

Continue reading
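
Two of the techniques the case study names, anchoring on a base rate and updating beliefs as new evidence arrives, can be sketched in a few lines. The prior, the likelihoods, and the bayes_update helper below are hypothetical illustrations, not taken from Bekoe and Mihalyi’s Ghana model.

```python
def bayes_update(prior: float,
                 likelihood_if_true: float,
                 likelihood_if_false: float) -> float:
    """Posterior probability of an event after observing one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Base rate (the "outside view"): suppose oil-revenue projections in
# comparable budgets have come in below target about 40% of the time.
p_shortfall = 0.40

# Belief updating: suppose a mid-year price drop is observed, and such a
# drop is three times as likely in years that end in a shortfall (60%)
# as in years that do not (20%). All figures are made up for illustration.
p_shortfall = bayes_update(p_shortfall,
                           likelihood_if_true=0.60,
                           likelihood_if_false=0.20)

print(f"Updated P(revenue shortfall): {p_shortfall:.2f}")  # 0.67
```

Repeating the update as each new piece of evidence arrives, rather than revising the forecast only once, is the “regular belief updating” the authors found valuable.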