October 28, 2021

I, Science

The science magazine of Imperial College

Everyone can—and should—learn to deal with risk and uncertainty for themselves, says Gerd Gigerenzer ...

This review was published in the Reviews section of the Super Science issue (issue 28), as a prominent event review of the past few months.


Remember the volcanic ash cloud from Iceland? The subprime disaster? Or mad cow disease? Every crisis makes us worry, until we forget it and start worrying about the next one. When something goes wrong, we are told that the way to prevent a new crisis is better technology, bigger bureaucracy, and stricter laws.

Gerd Gigerenzer, Director of the Center for Adaptive Behaviour and Cognition at the Max Planck Institute for Human Development, believes there is an alternative solution: risk-savvy citizens. During his talk at the Royal Institution, Gigerenzer argued that the experts we turn to in times of risk are often part of the problem: they don’t understand risk themselves, they cannot communicate it in an understandable way, or their interests are not aligned with ours. In a technological society, everyone can—and should—learn to deal with risk and uncertainty for themselves.

A good starting point is to understand the difference between absolute and relative risk. Gigerenzer used the example of the contraceptive pill scare in the UK. In 1995, a study suggested that third-generation pills increase the risk of thrombosis two-fold, that is, by 100%. This is what’s known as relative risk. But no one cared to explain to patients what this doubling meant in absolute terms: the number of women who suffered thrombosis increased from one to two out of every 7,000. Confronted with the scarier-sounding relative risk increase, many British women panicked and stopped taking the pill, leading to a 9% rise in abortions in England and Wales the following year.
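The arithmetic behind the scare can be made concrete with a short sketch. The 1-in-7,000 baseline is the figure Gigerenzer cites; the helper function and its name are my own illustration, not anything from the talk:

```python
def risk_change(baseline_rate, new_rate, population):
    """Compare the relative and absolute framing of the same risk change."""
    # Relative risk: how much the rate grew in proportion to itself (1.0 == +100%).
    relative_increase = (new_rate - baseline_rate) / baseline_rate
    # Absolute risk: how many extra cases that means in a real population.
    extra_cases = (new_rate - baseline_rate) * population
    return relative_increase, extra_cases

# Third-generation pill scare: thrombosis rose from 1 to 2 per 7,000 women.
rel, extra = risk_change(1 / 7000, 2 / 7000, 7000)
print(f"Relative increase: {rel:.0%}")            # the headline "doubled" figure
print(f"Extra cases per 7,000 women: {extra:.0f}")  # the absolute reality
```

The same change in risk reads as a frightening "100% increase" in relative terms, but as one additional case per 7,000 women in absolute terms.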

The second idea to keep in mind is that risk and uncertainty are not the same thing. Transportation statistics show that in the 12 months following 9/11, the miles driven on long-distance journeys in the US increased by 5%. In avoiding flying, an estimated 1,600 more people than average died on the roads. People knew that the risk of driving remained the same. What they didn’t know was whether there could be another terrorist attack. This is what’s called uncertainty.

When all the alternatives, probabilities, and consequences are known, we are dealing with risk. Risk requires logic and statistics. By contrast, in the world of uncertainty not every factor can, or will, be known. So answers to questions such as ‘Whom to marry? Whom to trust? Where to invest your money?’ require intuition and rules of thumb known as heuristics, not necessarily hard logic and stats.

The problem in our society is that statistics are placed above intuition and heuristics. We see this often in large companies. On average, half of all big business decisions are based on gut feelings, but managers will never admit to this. Instead, they waste intelligence, time and money making those decisions look like data-driven choices. Or, even worse, almost a third of the time managers will engage in defensive decision-making and suggest the second- or third-best option to deflect responsibility in case something goes wrong. This damaging behaviour affects other fields, such as politics and healthcare, as well. In the US, 93% of doctors admit to defensive decision-making in order to protect themselves from litigation.

Andrew Haldane, the Bank of England’s executive director for financial stability, explained in his ‘The dog and the frisbee’ speech that in complex decision-making problems, simple rules sometimes do just as well as complex solutions, if not better. For Gigerenzer, choosing between the two is simple: in situations with little uncertainty, few alternatives and large amounts of data, apply complex solutions and big data. But when faced with high uncertainty, many alternatives and little data, simple solutions and heuristics will work better.

Gigerenzer finished his talk by illustrating this idea that less can be more with an example from the world of sports. Researchers at the University of Chicago showed what happens when amateur and professional golfers are given only three seconds to complete each swing. Beginners get worse, because they need time to think about their position and their movements. Experts, on the other hand, get better. Their expertise is in the unconscious, in the same way that hunches operate in the unconscious mind. If an experienced person has a strong gut feeling about a decision, listen to them without always requiring a fact-based justification.

Gigerenzer’s principles are easy to follow: Always ask for absolute risks and what percentages stand for; consider whether you are dealing with risk or uncertainty and make decisions accordingly; remember that simple solutions can sometimes be better than complex ones and, finally, that heuristics and intuition are just as valuable as statistical thinking. Our modern democracies need risk-savvy citizens who will not be manipulated into unrealistic hopes and fears or easily threatened into surrendering their money, their welfare or their liberty.

The talk ‘Risk Savvy’ took place on 20 May 2014 at the Royal Institution.