A decade ago, Hugo Mercier and some colleagues wrote an article with the catchy title, “Epistemic Vigilance”. It argued that humans are naturally sceptical and not easy to convince, let alone fool.
Many people were… unconvinced.
“What about flat-earthers?” they would ask Mercier. “What about people voting for the Nazis?” Like the rigorous scholar he is, Mercier would go away every time and do some more research, to check that he hadn’t missed a glaring counter-example that proved him wrong. But instead, he found more and more evidence that people are not as gullible as we often assume.
Most people were not taken in by Nazi propaganda, or won over by Hitler’s speeches. Even vocal believers that the earth is flat hold that belief separate from the rest of their lives, which allows them to carry on as normal.
In the end, Mercier told me, he had enough material for a book, “a long argument against the idea that humans are gullible”. Not Born Yesterday analyses the historical examples and classic psychology experiments often cited to show how willingly we suspend our critical faculties and follow the crowd, or follow orders.
The notorious Milgram obedience experiments, for example. Subjects followed orders to the point of delivering what they thought were agonising and dangerous electric shocks to fellow volunteers (in reality, actors). But those who took the whole setup at face value were less likely to comply with the experimenters' instructions than those who expressed doubts, and authoritarian orders were less effective than appeals to science, with elaborate explanations delivered by a white-coated experimenter.
So while it’s not, after all, horrifying proof that we’re all potential Nazis, it does reveal some nuanced points about context. We don’t take information (or instructions) at face value. We’re constantly weighing up what’s going on. Who is telling me this? Why are they saying it? How far should I trust them? How much weight should I put on my new beliefs?
This is cheering news in today’s world of Fake News and conspiracy theories. Mercier doesn’t think they have much impact on the real world.
During the 2016 US presidential election, online sources including InfoWars propagated the bizarre claim that top Washington politicians, including Hillary Clinton, were running a paedophile ring in the basement of a pizza restaurant in Washington DC. But the remarkable thing about the Pizzagate conspiracy theory is that only one guy actually turned up there with a gun.
Restaurants and individuals smeared in the conspiracy theory did receive threats and online harassment. But most people who expressed belief in the conspiracy theory did nothing more than post one-star reviews for the restaurant on TripAdvisor. If they had truly believed there were abused children there, why didn't they take serious action? Their belief was of a kind Mercier calls reflective. "Believing something — a rumour or anything else — is not an all-or-nothing matter… A belief can remain essentially inert, insulated from cognitive or behavioural consequences, if we don't work out what inferences or actions flow from it."
Now hang on, you may say. I’m constantly reading nonsensical things on the internet about coronavirus, claims that you can test for it by holding your breath, or that drinking water will protect you from catching it. Otherwise intelligent people share such potentially dangerous rubbish, claiming it comes from Stanford Hospital or Taiwan. Isn’t that proof that we’re all gullible if you invoke the right flavour of authority at a time of heightened fear?
The problem, suggests Mercier, is that we evolved our social intuitions about what — and whom — to trust in a different world, where we got to know others over a long time, face to face, and could test new ideas against our experience of the world. Scientific ideas tend to come from people (or institutions) we don’t personally know, and are often very counterintuitive. When our basic plausibility test — does this fit with what I know of how the world works? — fails us, we have to resort to reasoning — does this argument seem sound to me? — and to deciding who is trustworthy on this subject.
But if anything, Mercier argues, we are inclined to give ourselves problems not by trusting too easily, but by being too difficult to persuade. We default to the status quo, more receptive to ideas that reinforce our worldview than those that challenge it. We doubt the motivations of unknown sources. We “make more errors of omission (not trusting when we should) than of commission (trusting when we shouldn’t)”.
Trust, after all, is not a commodity but an action. We learn how to judge trustworthiness by trusting, and sometimes finding out the hard way when we are wrong. The social scientist Toshio Yamagishi ran experiments and found that "the most trustful of their participants — those more likely to think that other people can be trusted — were also the best at ascertaining who should be trusted".
Most of this evidence is not new. The book marshals a convincing body of research, some of it decades old, from history and sociology, from anthropology and from the psychology laboratory. It was already there when Mercier went looking for research to answer the questions his critics asked.
This raises a question, as the book puts it: "If people are not gullible, why have scholars and laypeople through the ages, from Plato to Marx, claimed they were? … Isn't the spread of this misconception a sign of gullibility?" Mercier addresses this paradoxical belief in gullibility with the same tools he used on conspiracy theories, political propaganda, Fake News and anti-science scepticism.
First, it's an appealing rumour that comes with compelling stories of innocents duped and the threat of the irrational mob. The millions who weren't fooled don't make a shareable story, or bring any cachet to the teller. One person who fell victim is a tale of pity and fear, with room for the comforting thought that neither teller nor listener would have been that stupid.
Then, ask yourself whether we really believe everyone is gullible. Everyone except us, of course. If we did, wouldn't we all be making up preposterous stories to take advantage of that gullibility? It's closer to a "reflective belief" that we profess, but don't generally use to make real-life decisions.
Taking a historical view, it has been socially and politically useful to dismiss the masses as gullible, like children or animals, easily swayed by emotion, by demagogues, or by psychological manipulation. The masses couldn't be trusted with political power (if you wanted to preserve the status quo from radical change), or they were easily corrupted and made the 'wrong' political choices (if you wanted to explain why radical change still hadn't happened). Democracy is harder to defend if you portray voters as gullible.
And yet, there is little evidence that voters are too easily swayed. Throughout history, demagogues have generally succeeded, argues Mercier, either by surfing existing ideas or by repression, using propaganda more as a signal of power than a means of persuasion. Even current microtargeting techniques using social media have been shown to have little or no effect on votes cast.
It takes a lot of persuasion, argument, and building of relationships of trust to change our minds. It does happen: look how social attitudes have shifted in the past few decades over issues such as homosexuality or racial equality. But in general, says Mercier, we need to be more, not less, open to persuasion and to changing our minds.
“I do hope you come to accept the core of the book’s argument,” he writes. “But please, don’t just take my word for it. I’d hate to be proven wrong by my own readers.”