Psychologists haven’t had a great few years. First there was the “replication crisis”, which kicked off in about 2011 and involved the gradual realisation that many of our best-known, most-hyped results couldn’t be repeated in independent experiments. Then there were the revelations that the American Psychological Association, one of the field’s most important professional bodies, had colluded with the US government on its torture programme during the Iraq War, then attempted to cover it up.
Then some of the most famous studies from social psychology’s 1970s heyday fell apart on closer scrutiny. The Stanford Prison Experiment, where people were assigned roles as “prisoners” and “guards”, and the guards ended up treating the prisoners abominably? Probably misreported. The study where “pseudopatients” admitted themselves to psychiatric hospitals, acted entirely sane, and were locked up and medicated regardless? Possibly fraudulent.
Now, psychologists are disgracing themselves anew over the coronavirus.
It started with articles relying on psychological “insights” to downplay the severity of the problem. In early February, social psychologist David DeSteno wrote a piece in the New York Times arguing that people get so caught up with their fear of the virus, they fail to understand that they’re unlikely to get it. Referencing some of his own lab experiments, DeSteno wrote that “…quarantine or monitoring policies can make great sense when the threat is real and the policies are based on accurate data. But the facts on the ground, as opposed to the fear in the air, don’t warrant such actions.”
Two days later, the New York Times’s Interpreter column quoted psychologist Paul Slovic, who noted that “[o]ur feelings don’t do arithmetic very well”, and that focusing on the coronavirus fatalities, and not the “98% or so of people who are recovering from it and may have mild cases” is skewing our judgement. The article argued that our fears, triggered by disturbing reports of “city-scale lockdowns and overcrowded hospitals”, overload our critical faculties, making us overreact to the threat the virus poses. The thought that those city-scale lockdowns and overcrowded hospitals might be a mere month away from the United States didn’t seem to occur.
Further psychological insights were provided by Cass Sunstein, co-author of the best-selling book Nudge, which applied lessons from behavioural economics (essentially psychology by another name) to the design of policies intended to change people’s behaviour. In an article for Bloomberg Opinion on 28 February (by which point there were over 83,000 confirmed coronavirus cases worldwide), Sunstein wrote that anxiety regarding the coronavirus pandemic was mainly due to something called “probability neglect”.
Because the disease is both novel and potentially fatal, Sunstein reasoned, we suffer from “excessive fear” and neglect the fact that our probability of getting it is low. “Unless the disease is contained in the near future,” he continued, “it will induce much more fear, and much more in the way of economic and social dislocation, than is warranted by the actual risk”.
On 12 March, the day after Italy had announced its 827th death from the virus, the eminent psychologist Gerd Gigerenzer published a piece in Project Syndicate entitled “Why What Does Not Kill Us Makes Us Panic”. It was, to say the least, confused: it opened with an acknowledgement that we don’t know how bad this epidemic could be, but immediately went on to make the case that we’d likely overreact, and failed to consider any opposing arguments.
Gigerenzer’s article discussed the 2009 swine flu pandemic, which affected hundreds of millions but killed relatively few, particularly in Europe, where the fear was nonetheless high. It also likened the coronavirus pandemic to terrorist attacks, which similarly kill small numbers in a relative sense, but — according to Gigerenzer — garner disproportionate attention. If only we could improve our “risk literacy”, Gigerenzer maintained, then we could “approach situations such as the COVID-19 epidemic with a cooler head”.
At the time of writing, Italy’s death toll is now above 10,000, with over 900 fatalities reported in a single day. The epidemic in the UK has reached 1,000 deaths; in France, 2,000; in Spain, nearly 6,000. New York City is setting up makeshift morgues for the first time since 9/11. The suspiciously uniform daily death figures from Iran might conceal a truly devastating epidemic in that country. The world’s economy has largely closed down, as we all anticipate the pandemic getting far worse before it gets better. How could all those psychologists have got it so disastrously wrong?
To back up his points about “probability neglect”, Sunstein had referred to a 2001 paper in the journal Psychological Science. It reported three experiments; Sunstein focused on the third one, which included 156 participants, all of whom were undergraduate students reasoning about how much they’d pay to avoid an imaginary electric shock. It’s not a criticism of the scientists to say that this experiment is only tenuously relevant to a global pandemic.
Indeed, alongside talk of the “replication crisis” there’s been discussion of a “generalisability crisis”, with renewed realisation that results from lab experiments don’t necessarily generalise to other contexts. A global pandemic of a completely novel virus is, by its very definition, a context never encountered before. So how can we be sure that the results of behavioural science experiments — even those that are based on bigger or more representative samples than 156 undergrads — are relevant to our current situation?
The answer is that we can’t. Exploring the human capacity for bias and irrationality can make for quirky, thought-provoking articles and books that make readers feel smarter (and can build towards a tentative scientific understanding of how the mind works). But when a truly dangerous disease comes along, relying on small-scale lab experiments and behavioural-economic studies results in dreadful misfires like the articles we encountered above.
Although there are many behavioural phenomena that certainly seem relevant to today’s news — bias, sunk costs, the tragedy of the commons — it’s not at all clear how these concepts would be practically applied to do what needs to be done right now: slowing the spread of the disease.
We can even play the psychologists at their own game: there are also biases in the opposite direction to those discussed above. For instance, “exponential growth bias” might mean that our standard ways of reasoning break down in a situation where a threat accelerates like this virus. As observed in an insightful article on Italy’s ongoing horrific experience, there’s also “confirmation bias”, where we seek out evidence that confirms our previous beliefs or desires.
In this case, many were understandably desperate to believe that the virus wouldn’t be too much of a problem, and it seems to have led them to underreact very badly indeed. None of the psychological writers above seemed to consider these opposing biases — and ironically, it might have been because they were biased against thinking about them.
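Exponential growth bias is easy to demonstrate with a back-of-the-envelope calculation. The sketch below uses purely illustrative numbers — a starting count of 100 cases and a three-day doubling time, not real epidemiological data — to show how quickly a doubling process leaves a steady, “linear” mental estimate behind:

```python
# A toy illustration of exponential growth bias.
# The starting count and doubling time are illustrative assumptions,
# not figures from any real epidemic.

def cases(day: int, start: int = 100, doubling_days: float = 3.0) -> int:
    """Cases on a given day, assuming unchecked exponential growth."""
    return round(start * 2 ** (day / doubling_days))

def linear_guess(day: int, start: int = 100, per_day: int = 100) -> int:
    """What a 'linear' intuition might predict: a steady daily increment."""
    return start + per_day * day

for day in (0, 10, 20, 30):
    print(f"day {day:2d}: exponential ≈ {cases(day):>9,}"
          f"   linear guess = {linear_guess(day):>6,}")
```

After a month, the exponential process is more than thirty times larger than the linear estimate (102,400 versus 3,100 under these toy assumptions) — which is precisely the gap that makes an accelerating epidemic feel like an overreaction right up until it isn’t.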
That latter idea is just speculation on my part — which is the whole point. As the psychologist Stefan Schubert has noted, “[i]t’s too easy to make up a just-so-story” about us being biased in one direction or another. “The ‘psychology of X’ (e.g. risk, disease) can only tell us so much about X. We should mostly look at X itself”.
Quite so: the critical task for now is not to try to get inside people’s heads, but to learn from the experience of other countries (like Taiwan, South Korea and Singapore) that have been more successful in containing covid-19; to build realistic mathematical models of the epidemic; and to pour resources into medical advances — like tests, vaccines, and antivirals — that might help us to track and inhibit the transmission of the virus and prevent its symptoms.
Sunstein himself appears to have undergone something of a Damascene corona-conversion. On 26 March, he wrote another column for Bloomberg Opinion, entitled “This Time the Numbers Show We Can’t Be Too Careful”. In an argument that was directly opposed to his own reasoning of a month earlier, Sunstein suggested that the potentially ruinous effects of the pandemic mean that “the benefits of aggressive social distancing greatly exceed the cost”. Remarkably, he made no reference whatsoever to his previous article on this specific crisis — an article which, as the writer Ari Schulman has pointed out, had said that if you held the view Sunstein now holds, you were the sad victim of a cognitive malfunction.
It’s impossible to know how many people changed the way they thought about the virus after reading one of the misconceived articles above. But a consistent set of articles by credentialed scientists telling people they shouldn’t worry so much about the pandemic was the precise opposite of what was required in a situation where governments would eventually, and rightly, be imploring their citizens to stay at home at all costs. At best, the advice from psychology was unhelpful. At worst, it constituted dangerous misinformation.
And what of those governments? Could the “don’t overreact” advice from behavioural science have led politicians to drag their feet in dealing with the crisis, as some clearly have? Again, it’s hard to say — but it’s very possible that the broad message got through to the decision-makers.
Thankfully, the advice given to the UK government by behavioural scientists, which was published just over a week ago, didn’t contain any minimisation of the risks. In fact, it was — in my view at least — mostly banal and common-sensical, noting for instance that using the word “isolation” might have “negative overtones for older adults”, and that there might be unintended consequences if we closed schools.
Other government behavioural advisors, though, were less circumspect. The Behavioural Insights Team, a consulting company nicknamed the “Nudge Unit”, has been brought in to assist the UK’s response. At first it seemed their focus was on how to encourage handwashing, but there appears to have been mission creep.
For example, the team’s head, David Halpern, was interviewed as a “government coronavirus science advisor” about broad policies of “cocooning” older people. He was also quoted in support of the idea — which might yet seem grievously misguided in hindsight — that social-distancing measures should only be brought in gradually, to avoid people becoming fatigued. After he was reportedly “bollocked” by No.10 a fortnight ago for introducing the unfortunate phrase “herd immunity” into the national conversation, Halpern hasn’t (to my knowledge) been heard from again in public.
And perhaps that’s for the best. As intriguing as many psychological studies are, the vast majority of the insights we’ve gained from our research are simply not ready for primetime — especially in the case of a worldwide emergency where millions of lives are at stake. Much of the useful advice behavioural scientists can give isn’t really based on “science” to any important degree, and is intuitive and obvious.
Where they try to be counter-intuitive — for instance, arguing that people are wrong to find a global pandemic frightening — they simply end up embarrassing themselves, or worse, endangering people by having them make fewer pandemic preparations. This isn’t to say that psychology isn’t useful when it stays in its own lane: it’ll be important to ensure that as many people as possible have access to psychotherapy for the mental-health effects of the pandemic, for instance. But that’s a secondary effect of the virus: my argument here is that psychology can give little reliable counsel about our immediate reaction to the pandemic.
Psychologists should know their limits, and avoid over-stretching results from their small-scale studies to new, dissimilar situations. Decision-makers should, before using psychology research as the basis for policy, know just how weak and contentious so much of it is. And everyone else should stay at home, wash their hands — and beware psychologists bearing advice.