Humans are not fully rational creatures. We see the world through a collection of biases and rules of thumb which, in systematic and predictable ways, make us believe things that are simply wrong. They incite us to cling to arguments that support our beliefs and to reject those that undermine them, even in the face of contradictory evidence.
This is the motivating force behind large parts of our toxic discourse: it’s why left-wingers not only don’t share opinions with right-wingers, they often don’t even share access to the same facts. You can see it, right now, in the furious war over antisemitism in Labour. If you want to believe that Jeremy Corbyn’s Labour is antisemitic, it’s easy to find examples. If you want to believe that Corbynite antisemitism is a Tory smear, it’s easy to find Tories doing that smearing. Our minds are geared to pick up the things that support our arguments and ignore the ones that don’t.
In the last 40 years or so, psychological science has uncovered a large number of the biases behind this tendency. For instance, there’s the availability heuristic: we judge the likelihood of something happening not by any sort of statistical process, but by how easily we can call an example to mind. And that means we tend to think of dramatic, memorable, or widely reported events – plane crashes, shark attacks – as likelier to occur than they really are.
This leads to bad decisions: in the year after 9/11, around 1,500 more people died on American roads than usual, because the terrifying images had made hijackings and plane crashes more available to people’s minds, so they drove instead of flying. But, in fact, flying is far safer than driving. The availability heuristic killed roughly half as many people as died in the two towers themselves.
Then there’s scope insensitivity, which makes us blind to numbers. For instance, in one study, three groups of people were asked how much they would pay to save X seabirds from an oil spill. The first group was told that X was 2,000; the second, 20,000; the third, 200,000. The three groups’ answers were, respectively, $80, $78, and $88. Apparently we don’t think about the numbers at all; we just picture a sad bird covered in oil, and put a dollar value on how sad that picture makes us feel.
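To see quite how strange those answers are, it helps to work out the value they imply per bird. A quick back-of-the-envelope sketch (the calculation is mine, not the study’s, using the reported figures):

```python
# Implied value per bird in each condition of the seabird study.
# Keys are the number of birds described; values are the average
# willingness-to-pay answers reported above, in dollars.
conditions = {2_000: 80, 20_000: 78, 200_000: 88}

for birds, dollars in conditions.items():
    print(f"{birds:>7,} birds at ${dollars}: ${dollars / birds:.5f} per bird")
```

Each tenfold increase in the number of birds cuts the implied value of a single bird by roughly a factor of ten: four cents per bird in the first group, a twentieth of a cent in the third. If people were actually valuing the birds, the third group’s answer should have been about a hundred times the first group’s; instead it barely moved.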
This has obvious implications for policy: when we read that the NHS is denying a child an expensive cancer drug, say, we are appalled by the image of the child dying for want of money – but we don’t ask whether that same money could save several children with less dramatic but more treatable diseases.