He's probably a Popperian (ANGELA WEISS/AFP via Getty Images)


August 31, 2021

Imagine you bought a book with the title How to Talk to A Contemptible Idiot Who Is Kind of Evil. You open the book, and read the author earnestly telling you how important it is that you listen, and show empathy, and acknowledge why the people you’re talking to might believe the things they believe. If you want to persuade them, he says, you need to treat them with respect! But all the way through the book, the author continues to refer to the people he wants to persuade as “contemptible idiots who are kind of evil”. 

At one stage he even says: “When speaking to a contemptible idiot who is kind of evil, don’t call them a contemptible idiot who is kind of evil! Many contemptible idiots find that language insulting.” But he continues to do it, and frequently segues into lengthy digressions about how stupid and harmful the idiots’ beliefs are. Presumably you would not feel that the author had really taken his own advice on board.

This is very much how I feel about How to Talk to A Science Denier, by the Harvard philosopher Lee McIntyre.

McIntyre wants to help us change people’s minds. Specifically, to help us change the minds of these strange, incomprehensible people called “science deniers”. He addresses five main groups of “deniers”: flat earthers; climate deniers; anti-vaxxers; GMO sceptics; and Covid deniers.

This is, on the face of it, an important project. It’s a truism that the world is polarised, and our sense of shared reality is under attack. If there is some way of learning how to talk across difference, and to persuade without attacking, that might go a long way to bridging our various divides, not just the five he discusses.

The framing is that McIntyre goes and meets representatives of these groups and tries to persuade them out of their wrong beliefs. He goes armed with social-psychology research about how best to persuade people. His big trick (which I think is a good, if limited, one) is asking: what evidence would it take to make you change your mind?

But the whole book is premised on one idea: McIntyre is right, and the people he is “talking to” are wrong. 

And it’s true that all five groups are wrong, or at least their central claims are. The earth is in fact an oblate spheroid; the climate is warming, due to human influence, and will likely have severe negative impacts; vaccines work; GMOs are safe; and Covid is real.

The trouble is that by using these groups, McIntyre is playing on easy mode. When your example of a “science denier” is a literal flat-earther, it’s easy to say “look over there at the crazy deniers”.

Even with climate change scepticism, sure, there are people who literally don’t believe that anthropogenic greenhouse gases are warming the planet. But those people are relatively rare. People who believe that anthropogenic greenhouse gases are warming the planet, but that the emissions will be hard to stop because of economic growth in the developing world, and that it would make more sense to concentrate on adaptation rather than mitigation, are much more common. Are they “deniers”? Certainly they’re often called deniers. But McIntyre himself acknowledges that China is by far the largest emitter of greenhouse gases, and that the IPCC says the sweeping global changes required to cut emissions enough to keep warming below 1.5°C are unprecedented.

McIntyre constantly wants to make a clean distinction between “science deniers” and non-deniers. So, for instance, he says that there are five “common reasoning errors made by all science deniers” [my emphasis]. They are: cherrypicking, a belief in conspiracy theories, a reliance on fake experts, illogical reasoning and an insistence that science must be perfect. If you don’t make all five of those errors, you’re not an official McIntyre-accredited science denier.

Hang on, though. A “belief in conspiracy theories”? McIntyre spends a lot of time talking about the tobacco firms who manufactured doubt in the smoking/lung cancer link, and the oil firms who did the same with the fossil fuel/climate change link. He says that the spread of Covid denialism through the US government was driven by Republican desire to keep the economy open and win the election. Aren’t these conspiracy theories?

Ah, but for McIntyre these aren’t conspiracy theories, they’re conspiracies. The distinction is “between actual conspiracies (for which there should be some evidence) and conspiracy theories (which customarily have no credible evidence).”

So, since some anti-vaxx conspiracy theories, such as the polio vaccine giving children polio, or the CIA using fake vaccination stations to take people’s DNA, are true, does that mean anti-vaxxers don’t believe in “conspiracy theories” but “conspiracies”?

Obviously not. But the point is that there’s not some clear line between “real conspiracies” and “conspiracy theories”. When Alex Jones says that chemicals in the water are turning frogs gay, he’s referring to real claims that endocrine disruptors are affecting sexual development in lots of animals. It’s not easy to draw a line between real and fake, evidence-based and not evidence-based.

I think the basic problem is that McIntyre is a Popperian. That is, in hugely oversimplified terms, he believes that no amount of evidence can confirm a theory, but evidence can falsify it. “If we find only evidence that fits our theory, then it might be true,” he writes. “But if we find any evidence that disconfirms our theory, it must be ruled out.”

I, on the other hand, am a Bayesian. I have some prior belief and I assign some level of probability to it: “climate change is real and dangerous”: 90%; “the world is flat”: 0.1%. And then each new piece of evidence shifts my belief a little: if next year NASA say “we got new photos in, looks like Earth is sitting on the back of a turtle”, then I’ll upgrade my belief in a flat earth to, I dunno, 1.5% (but also upgrade my belief in there being mad people at NASA to 95%).
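For the curious, here is a minimal sketch of what that sort of update looks like under Bayes’ rule. The 0.1% prior is the one above; the likelihood ratio for the hypothetical turtle photo is a number I have simply made up for illustration.

```python
# Toy Bayesian update: start from a 0.1% prior that the Earth is flat,
# then see the (hypothetical) NASA turtle photo. Suppose that photo is
# 15 times more likely in a flat-Earth world than in a round-Earth one,
# a likelihood ratio invented purely for this example.

def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability after seeing evidence with the given
    likelihood ratio, P(evidence | hypothesis) / P(evidence | not hypothesis)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

print(round(bayes_update(0.001, 15), 4))  # 0.0148, roughly the 1.5% above
```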

So I don’t need to draw a bright line between “denial” and “reality”. I can say: “I think it’s likely that tobacco firms conspired over lung cancer, but I think it’s pretty unlikely that NASA faked the moon landings.” And I can update my beliefs as new evidence comes in. I don’t have to “rule anything out”, I can simply downgrade how likely it is.

McIntyre, though, is stuck with two categories: things that might be true; and things which have been “disconfirmed”. If you believe things that have been disconfirmed, then you must be a “denier”. And so he needs to find ways of explaining why these “deniers” are so different from the rest of us. 

He has various ideas about “inflated self-confidence, narcissism, or low self-esteem”. But if you reject the idea that there are two groups of people, “deniers” and “non-deniers”, then you can avoid the need to explain it at all, beyond saying “some people are better than others at working out what’s true”. 

But we’re not just here for his epistemology: we’re here for a masterclass in how to persuade people out of false beliefs. Over the course of the book he meets various people — the flat earthers; two coal miners; a couple of hippyish friends of his — and tries to talk to them about their beliefs, using the methods he has learnt. His solution is to listen, to be respectful, to meet people face to face, and to do so over several meetings. Does his approach work?

In short: no. Hilariously, both of the coal miners he meets cheerfully accept the reality of climate change, but say that the economic value is worth the potential damage to the climate. His first hippyish friend is entirely pro-vax and only slightly GMO-sceptical; the other one is anti-GMO but on anti-corporate grounds rather than safety ones. 

So he falls at the first hurdle: he not only doesn’t convince anyone, he doesn’t meet anyone who unambiguously disagrees with him, except the flat earthers. (He also struggles to be respectful, at least in the book itself. There’s an astonishing line on p77 in which he says “When speaking to them, we should remember that it is an insult to use the word ‘anti-vaxxer.’” There are 109 uses of the term “anti-vaxx” in the book. Occasionally he remembers and says how important it is that we listen and pay attention, before immediately reverting to calling, say, Covid scepticism “ridiculous conspiracy theories and partisan nonsense”.)

But there’s a bigger problem. McIntyre’s big question, as mentioned, is: what evidence would it take to change your mind? But at no point does McIntyre ever ask himself what it would take to change his own mind.

For instance: when he was talking to the Pennsylvania coal miners, he accepted that they were just trying to feed their families. I assume he’d also acknowledge that Chinese coal mining is allowing that country to get richer and improve its citizens’ way of life. But I don’t think I’m misrepresenting him when I say that he thinks coal mining is a disaster.

When he talks to a friend of his about GMOs, though, that friend says that even though GMOs can save lives now (in the form of golden rice), they’ll cause disaster in the future. McIntyre says, OK, so the kids who can’t get the golden rice now, they’re just going to die? And his friend says yes. McIntyre says that’s easy for him to say, “because he had money and wouldn’t be one of the ones who suffered”.

The exact same question, though, can be asked about coal mining. Sure, McIntyre can say stop using coal, and it’ll help prevent future disasters. But it will also presumably mean some number of tens or hundreds of millions of Chinese people losing electric lights and functioning hospitals, and a smaller number of Pennsylvanians losing their jobs. McIntyre himself would be fine, except for somewhat higher electricity bills. 

Is the tradeoff worth it? McIntyre clearly thinks so (and I think I do too): but what would change his mind? I can tell you: I would update my beliefs significantly if you showed me a utilitarian calculation showing that more people would be harmed by ending coal mining than by continuing it. But McIntyre never asks himself the question. He is stuck on transmit, never on receive.

What’s sad is that he sometimes comes close. He recognises that beliefs are part of people’s identity, and that that makes it hard to change them – but again, applies the lesson only to the weird, wrong, other people, not to himself and people like him. The near-total lack of introspection renders the whole grand project largely meaningless. I am right, you are wrong, the only thing we need to discuss is how to make you realise how wrong you are. The idea of working together to establish a shared reality is hamstrung by his certainty that the reality that needs to be shared is his one.

It’s mainly a book designed to tell readers that people they already think are dumb are, in fact, dumb. It is, really, How to Talk to A Contemptible Idiot Who Is Kind of Evil.


Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.
