Science is a risky business: all measurements have inherent errors associated with them, and future events can never be predicted with 100% accuracy. But as more areas of society become intertwined with science, experts are being called upon to communicate their scientific insights quickly and clearly, but accurately – including the uncertainties. That won’t be easy, but it should be possible, says Brigitte Nerlich.
Many articles have already been written about the case of the three seismologists, two engineers, one volcanologist and the public official who were sentenced to six years in jail in L’Aquila, Italy. These members of the National Commission for the Forecast and Prevention of Major Risks were found guilty of manslaughter for ‘falsely’ reassuring or ‘over’-reassuring people in the town of L’Aquila about the likelihood of a major earthquake at the end of March 2009. Soon after they concluded that the risk of an earthquake was minimal, the town was struck by a major earthquake and more than 300 people died. Prosecutors and victims’ relatives claimed that 29 of those who died would have left their houses had they not felt reassured by the authorities, and that those lives could therefore have been saved.
This verdict has started a process of rethinking what it means to make predictions, to engage in risk and uncertainty communication, and to give scientific advice in high-risk situations involving public confusion about risk and uncertainty. This is a highly complex affair and one fraught with almost irresolvable dilemmas. Here I can only highlight some of them. Scientists are generally exhorted to be open and honest about uncertainties. This is difficult in situations where certainty is what people expect to hear.
So how can scientists navigate between the Scylla of being open about uncertainty and the Charybdis of public and political expectations of pronouncements of certainty, as well as, perhaps more importantly, between the rock of scaremongering and the hard place of ‘complacency mongering’? There are quite a few obstacles in the way. Making predictions is not just a matter of being right or wrong. There are two kinds of ‘right’ and two kinds of ‘wrong’, all with different costs and benefits. You can predict that something will happen and be right or wrong, or you can predict that it won’t happen and again be right or wrong. These are called true positives, false positives, true negatives and false negatives, respectively. The real problem is that these trade off against each other: more true positives can only be gained at the risk of more false positives, and more true negatives only at the risk of more false negatives. There is no simple way ‘just to be right’. Scientists are generally trained to think that false positives should be avoided, while false negatives are relatively harmless. (In an experimental lab you do not publish a finding until the evidence is conclusive. Meanwhile there is no finding and nothing to say.) But in real-world terms this can mean not giving a warning because the danger is not proven, when in fact the warning would have been helpful. If your sole aim were to avoid false alarms, you would never give any warnings. And if your sole aim were never to miss a real danger, you would issue warnings every day. Neither of these situations is satisfactory, and all the real options involve a mixture of both disadvantages.
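The trade-off described above can be made concrete with a toy sketch (my illustration, not from the article): a hypothetical warning system that issues an alert whenever a risk score crosses a threshold. Lowering the threshold catches more real events but raises more false alarms; raising it does the reverse. The scores and events below are invented for illustration only.

```python
def confusion_counts(scores, events, threshold):
    """Count true/false positives/negatives for a given alert threshold.

    scores: risk scores from some hypothetical forecasting model
    events: whether a dangerous event actually occurred
    """
    tp = fp = tn = fn = 0
    for score, event in zip(scores, events):
        warned = score >= threshold
        if warned and event:
            tp += 1        # warned, and the danger was real
        elif warned:
            fp += 1        # warned, but nothing happened (false alarm)
        elif event:
            fn += 1        # stayed silent, but the danger was real (missed warning)
        else:
            tn += 1        # stayed silent, and nothing happened
    return tp, fp, tn, fn

# Invented toy data: eight forecasts and their outcomes.
scores = [0.1, 0.3, 0.4, 0.55, 0.6, 0.7, 0.8, 0.9]
events = [False, False, True, False, True, False, True, True]

cautious = confusion_counts(scores, events, threshold=0.35)  # warn often
strict = confusion_counts(scores, events, threshold=0.75)    # warn rarely

# The cautious threshold misses no real events but raises false alarms;
# the strict threshold raises no false alarms but misses real events.
print("cautious (tp, fp, tn, fn):", cautious)
print("strict   (tp, fp, tn, fn):", strict)
```

On this toy data the cautious setting yields zero false negatives at the cost of two false positives, while the strict setting yields zero false positives at the cost of two false negatives: neither threshold is ‘just right’, which is the dilemma the text describes.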
So what can scientists do about communicating risk in such a profoundly dilemmatic situation when, in addition, the communication of risk is full of complex political and communicative pitfalls and slippages, as was the case in L’Aquila? After the L’Aquila verdict, David Spiegelhalter, Winton Professor for the Public Understanding of Risk at the University of Cambridge, provided some advice about how one may be able to proceed. The most important points he makes are, to slightly paraphrase: (1) Never give advice unless confident that the findings will be communicated either by yourself or a trusted professional source, using a pre-determined plan and appropriate, carefully chosen language that acknowledges uncertainty and does not either prematurely reassure or induce unreasonable concern. (2) Do not engage in informal communication using social media on that issue. (3) Ensure proper indemnity arrangements are in place.
Peter Sandman and Jody Lanard, two experts on risk communication, also provided some good advice. They stress that it is important to “alert scientists to their obligation to inform the public candidly about uncertain risks, instead of giving in to the temptation to over-reassure.” They go on to say: “To get the job done, experts have to proclaim their uncertainty, insisting aggressively that they’re not at all sure about the things they’re not sure about.” These are some signposts and lighthouses that might allow scientists to navigate the dangerous seas of risk communication, but it will always be difficult.
Brigitte Nerlich is Professor of Science, Language and Society at the University of Nottingham, UK. She studies the debates that surround important contemporary scientific topics like emerging diseases and climate change, with a special emphasis on the use of metaphors.