
What we can and can’t say about what we do and don’t know


Earlier in the summer, the Democratic party and its supporters faced a difficult decision: should they gently but firmly sideline President Joe Biden from the 2024 ticket? There were plenty of reasons to agonise over the decision: loyalty to Biden; the daunting practicalities of the switch; fear of the chaos that might ensue; nervousness that the likely replacement, Biden’s vice-president Kamala Harris, wasn’t up to the job.

But such nerve-racking judgment calls are meat and drink to the likes of Nate Silver, author, poker player and widely admired forecaster of election results. In his new book On The Edge, Silver examines and admires the culture of what he calls “The River” — people who think probabilistically, are happy to be contrarian and have a high tolerance for taking risks.

For a Riverian such as Silver, the decision was simple. There was plenty of data from opinion polls indicating that Biden was likely to lose the election. The same data suggested that most plausible replacements, including Harris, would do better. Yes, there was some risk in ejecting Biden, but overall it was a bet worth making. That’s how the world looks to a Riverian. (Most politicians are not Riverians but, notably, many of the financiers and entrepreneurs who fund political campaigns are.)

There is much to be said for thinking probabilistically and for being willing to take reasonable risks. Yet, as John Maynard Keynes wrote in 1937, some things — like “the prospect of a European war . . . or the price of copper and the rate of interest 20 years hence” — are profoundly uncertain. “About these matters there is no scientific basis on which to form any calculable probability whatsoever. We simply do not know.”

One could disagree with that, as statistician David Spiegelhalter does in his forthcoming book The Art of Uncertainty. Surely when assessing the chances of war in 1937, one could do more than merely shrug. Still, Keynes was on to something. A political forecaster can look at the opinion polls. A poker player can calculate the odds that the next card revealed will be an ace. But the geopolitical analyst can do no better than make an educated guess.
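To see the difference, here is a minimal sketch of the poker player’s side of that contrast, assuming a Texas hold’em hand in which five cards are visible and none of them is an ace (the game and the numbers are assumptions for illustration):

```python
# A minimal sketch of the poker player's calculation, assuming a Texas
# hold'em hand in which five cards are visible (two hole cards plus a
# three-card flop) and none of them is an ace. These assumptions are
# illustrative.

ACES_IN_DECK = 4
DECK_SIZE = 52

def ace_next_probability(cards_seen: int, aces_seen: int) -> float:
    """Probability that the next card dealt is an ace, given what is visible."""
    aces_left = ACES_IN_DECK - aces_seen
    cards_left = DECK_SIZE - cards_seen
    return aces_left / cards_left

# Two hole cards and a three-card flop visible, no aces among them.
print(f"{ace_next_probability(cards_seen=5, aces_seen=0):.3f}")  # about 0.085
```

The point is not the particular number but that there is a number to compute; no amount of counting gives the geopolitical analyst an equivalent answer.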

Should we then refuse to dignify deep uncertainty with a confected probability estimate? There is something to be said for avoiding quantification: scenario planners often view probability estimates as a distraction. Would Hitler invade Poland? Rather than ask “How likely is that to happen?” it might be more fertile to ask “What would we do if he did?”. For contingency planning, and for trying to broaden our understanding of what might be possible, that is not such a bad approach.

Yet there are dangers in shying away from a guess at probabilities. For the psychologist Phil Tetlock, famous for his research into “superforecasters”, vague verbiage lets fortune-tellers off the hook. A statement such as “Keir Starmer may find the path ahead will bring unexpected challenges” might sound insightful until you reflect for a minute.

Another risk is that words such as “likely” or “common” do not convey what we think they do. When you are told constipation is a “common” side-effect of statins, what does that suggest to you? As Spiegelhalter explains, a survey of patients reckoned “common” meant about a third of the time, but to regulators in the UK and the EU a “common” side-effect happens between 1 and 10 per cent of the time.

These ambiguities can have serious consequences. In 1961, the US joint chiefs of staff reckoned there was a 30 per cent chance that an invasion of Cuba by exiles, supported by the US, would topple Fidel Castro’s regime. In a report to the president, this number was translated into “a fair chance”. But “a fair chance” could mean anything. President Kennedy gave his approval to what became the Bay of Pigs fiasco, thinking his advisers were confident of success. They should have stuck to the numbers after all.

Sometimes we have a good idea of the risks we face, and sometimes we don’t have a clue. Sometimes trying to think through the probabilities is a clarifying exercise, and sometimes it offers nothing more than false reassurance. So what to do? Spiegelhalter admiringly describes the five-stage approach of zoologist John Krebs, who as chair of the Food Standards Agency had to deal with the BSE crisis. The five steps are: tell people what you know, what you don’t know, what you’re doing to find out, and what they can do in the meantime. Finally, remind them that the advice may change.

These are solid principles for communicating in an uncertain situation. But they are also a strong starting point for thinking rigorously in an uncertain world. We should all be asking ourselves what we know, what we don’t and how we plan to fill the gap in our knowledge.

This, perhaps, suggests a gap in the risk-taking Riverian viewpoint. The word “experiment” does not make it into Silver’s index. For a poker player that may make sense: in a game of poker the only way to find out is to bet. Much the same is true if you’re planning an invasion of Cuba. But often, intelligent experiments can resolve uncertainty at minimal cost.

Often, but not always. Silver and Spiegelhalter would both call themselves Bayesians, a word Silver defines as “comfortable quantifying . . . intuitions and working with incomplete information”. But if Bayesian rings a bell, it is also the name of Mike Lynch’s luxury yacht, which so shockingly sank last month. Some risks blindside us all.

Written for and first published in the Financial Times on 6 September 2024.

Loyal readers might enjoy the book that started it all, The Undercover Economist.

I’ve set up a storefront on Bookshop in the United States and the United Kingdom. Links to Bookshop and Amazon may generate referral fees.


