October 28, 2020

There was a tweet doing the rounds the other day, about our test, trace and isolate (TTI) system. (Our TTI system is, for the record, not doing very well.) It got thousands of retweets for saying: “Cost of track and trace system: Ireland £773,000; UK £12,000,000,000. Guess which one works?”

I was surprised by it, not because I believed the numbers, but because I was surprised that anyone believed the numbers. A moment’s thought — simply dividing the second number by the first number, something you can do in your browser search bar in under five seconds — would have told you that this implied the UK TTI system cost 15,000 times what the Irish one cost. Surely, put like that, it is unbelievable. But two thousand people retweeted it, and six thousand pressed “like”, so presumably most of them did not check.

In case you’re wondering, the problem was that the tweet compared the cost of the Irish TTI app with the money the British Government had put aside for the entire costs of TTI, including, for instance, the wages of the contact tracers; a fairer comparison would be with the £640 million or so that the Irish government will spend in total. It’s pretty obvious that the UK TTI system has been an expensive failure, but this comparison is specious.
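The arithmetic involved here really is browser-search-bar stuff. As a sketch, using the figures quoted above (the £640 million total Irish spend is the article's rough estimate, not an official number):

```python
# Sanity-checking the tweet's numbers with the figures given in the article.
uk_tti_budget = 12_000_000_000   # money the UK government set aside for all of TTI
irish_app_cost = 773_000         # cost of the Irish contact-tracing app alone
irish_tti_total = 640_000_000    # approximate total Irish TTI spend

# The ratio the tweet implies: the UK system costing ~15,500x the Irish one.
print(uk_tti_budget / irish_app_cost)

# The fairer (still unflattering) comparison: roughly 19x.
print(uk_tti_budget / irish_tti_total)
```

The first number should be implausible on its face; the second is the one worth arguing about.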

You see these things all the time — claims about the world which a moment’s checking would reveal not to be true, and yet which people do not bother to check. It reminded me of something I read in Tim Harford’s recent book, How to Make the World Add Up. (Which is, it pains me to say since it’s on a related topic to my own upcoming book, excellent and interesting. Do me a favour: if you buy it on the back of my recommendation, at least buy mine as well.)

Harford points out that quite often, we aren’t all that interested in being right — or, rather, in having true beliefs. “Maybe this sounds absurd,” he says. “Don’t we all want to figure out the truth?” But often, believing things that are true is of much less importance than believing things which are socially acceptable. He gives the example of climate change, and farmers in Montana. Montana is a conservative, Republican-voting state; climate change is a politically charged term. 

Montanan farmers see the impacts of climate change all the time, in failed crops and bad harvests. But expressing a belief in climate change comes at a social cost — a Republican farmer could easily be ostracised by her friends if she does it. There is essentially zero cost to saying the opposite, though, because — even if climate change is real — what can you, as one of nearly eight billion contributors to it, actually do? 

The negative consequences of socially unacceptable beliefs are large, immediate and close; the negative consequences of objectively false beliefs are often negligible. “With a handful of exceptions — say, if you’re the president of China,” says Harford, “climate change is going to take its course regardless of what you say or do. From a self-centred point of view, the practical cost of being wrong is close to zero.”

If you are — as I am — a firm believer that climate change is a real and pressing concern, then it might be easy to dismiss this as simply “the climate deniers are at it again”. But we all do it. For a significant percentage of the things we believe, we care less about whether they are true than about whether they are socially acceptable.

Kevin Simler, co-author with Robin Hanson of The Elephant in the Brain, wrote a wonderful blog post about this a few years ago, looking at what he called “crony beliefs”. 

Our brains, he says, work to establish true beliefs about the world. If there is a tiger in the bush, it is important to establish a belief that there is a tiger in the bush, in order not to get eaten by the tiger. Less dramatically, if planting crops in March gives the best harvest, then it is important to establish a belief that planting crops in March gives the best harvest. True beliefs help us to stay alive, feed ourselves, that sort of thing.

But that’s not all they do. Simler draws a comparison with employees at a company — Acme — in a town called Nepotsville, where, in order to do business, everyone knows that you have to hire people who are well-connected to the city council.

“In this environment,” says Simler, “Acme faces two kinds of incentives, one pragmatic and one political.” Pragmatically, you need to do good work and finish it on time so that people hire you to do more. So you need to hire qualified workers and fire the ones who are underperforming. You need to act as a meritocracy.

But politically, you need to keep the council sweet, so you need to engage in what Simler calls “cronyism”. You need to hire, say, the Mayor’s nephew, even though you know him to be a useless layabout, because if you don’t, the council will make life difficult for you.

Someone looking around the office, not knowing how things work, might be surprised to find — scattered among the competent, hard-working employees — several nose-picking simpletons who, for some reason, have never been fired. They are surprised because they wrongly assume that those people are there to do work, rather than to provide political cover.

“I contend,” says Simler, “that the best way to understand all the crazy beliefs out there — aliens, conspiracies, and all the rest — is to analyse them as crony beliefs. Beliefs that have been ‘hired’ not for the legitimate purpose of accurately modeling the world, but rather for social and political kickbacks.”

So as well as meritocratic beliefs, which are about establishing true facts about the world — the best time to plant seeds, the presence or otherwise of a tiger — you have crony beliefs, which are intended to win you political or social credit. Simler quotes Steven Pinker: “People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true.”

Obviously this isn’t a hard and fast distinction: it’s not that all crony beliefs are false. For instance, as Harford says, while the Montanan farmers might be ostracised for professing a belief in climate change, someone in Portland, Oregon — or north London — might be ostracised for saying it’s a Chinese hoax. As it happens the north Londoner’s belief might be objectively true, but it would still be socially unacceptable not to believe it. By analogy, you might hire the Mayor’s nephew, and then it might turn out that he’s really good at his job. But he still provides you with political cover. 

Everyone has crony beliefs, on left and right; we all believe things, or at the very least profess them, because it is socially advantageous to do so. A left-wing equivalent of the climate change belief might be the claim that IQ is pseudoscience, for instance. 

That said, I suspect that people differ in the extent to which they are comfortable with them — as John Nerst has pointed out, the social category “nerds” is defined, partly, by “a concern for correctness over social harmony”, with a tendency to blurt out socially uncomfortable facts at inappropriate moments. By implication, then, most non-nerds are concerned with social harmony over correctness, and so will be more likely to believe things that are objectively false but which make living in society easier. (This makes sense to me. One of my proudest moments as a nerd was when the quantum computer scientist Scott Aaronson reviewed my first book and said that it had the rare quality of trying to assess ideas on “a scale from true to false, rather than from quirky to offensive”.) Still, we all have them.

Spotting them is hard, and Simler makes a few suggestions: if you get angry when someone tries to correct your belief, that’s a sign that it may be a crony one. You wouldn’t get angry if you said “The Liverpool game kicks off at 7:45pm” and someone said “actually no it’s 8pm” — you actually need that information to make plans. But if you said “the 2019-2020 Liverpool team is the greatest club side in history” and someone replied “actually no it’s the Ipswich Town 2000-2001 team,” you might angrily defend Liverpool’s claim, if in fact you were using that belief to profess membership of a group — if it was acting as a crony. Similarly, if they’re vague and abstract, with few real-world consequences, that’s another warning sign.

A need to constantly proclaim your belief might be another point. You don’t need to go around telling people that you believe that your car needs petrol to move; that belief is boringly pragmatic, designed solely to help you get from A to B. But you might go around loudly comparing mask-wearers to Nazi collaborators, if the point of that belief was to demonstrate that you are a member of the Lockdown Sceptics tribe. (The reverse is also true. I might make loud public noises about masks being good, because the tribe of which I am a member is pro-mask. It feels like I genuinely believe that masks are good, and I think the evidence is very much on my side — but I imagine that’s how it feels to the Nazi-comparing chap I just mentioned. Crony beliefs can still be true, remember.)

The obvious question is what my crony beliefs are. But our brains have a whole array of tricks to keep us from noticing our own false beliefs; I don’t suppose I am going to be very good at spotting my own. My greatest concern is over things I’ve previously confidently proclaimed – for instance, that social media probably isn’t causing a wave of teen suicides and that there probably isn’t an epidemic of loneliness. If either of those turn out to be false, I’ll look silly, so it’s socially important for me that I continue to believe them. My repeated claims that the world is getting better and that we are generally becoming less bigoted as time goes on are the sort of things that could be cronyish, although – obviously – I do actually think that all of these things are true.

To return to the Irish TTI tweet, it seems pretty obvious, to me, what was going on. The tweet wasn’t true, but it signalled something that people wanted to express — that the Conservative government is incompetent and corrupt, funnelling money to its friends rather than trying to save lives. The belief, like Montanan farmers’ professed belief that climate change isn’t real, costs nothing or next to nothing in real terms — if you are wrong about the cost of TTI, it will make no difference to any decision you have to make. But if you hang around in left-liberal social circles, then signalling that you think the Tories are bad is socially advantageous, and saying “hang on a minute, actually” will only lose you friends. (Believe me, I know.) 

Assuming you do want to believe things that are true, Harford has recommendations: one key one is observing your own feelings about some claim, and seeing whether it makes you feel angry, or happy, or vindicated, and if it does, to wonder whether you believe it because you think it’s true, or you believe it because it’s socially advantageous. Simler, meanwhile, suggests that since crony beliefs are driven by social incentives, the trick is to create social incentives for believing true things. He points to the rationalists, a group of online nerds, as one group who are obsessed with being “less wrong”, and who have created a community where you are rewarded for obeying norms of truth-seeking and debate, rather than for believing specific things.

I hope these tricks work. In a world where thousands of people can believe that the UK TTI system is 15,000 times the price of the Irish one, we need all the help we can get.