At UnHerd we believe that ‘echo chambers’ – i.e. bubbles of like-minded opinion where contradictory arguments go unheard – are a bad thing.
Of course, that belief is underpinned by a number of assumptions that should themselves be questioned. For instance, what about the research showing that, despite the polarisation of online debate, people are frequently exposed to the arguments of the ‘other side’, and, moreover, tend to become more hardened in their own opinions when they are?
Does that mean that echo chambers are nothing to worry about, or even a good thing? Actually, we do need to worry – but to understand why, we need a better description of the problem.
In an essay for Aeon, C Thi Nguyen makes a very useful distinction between echo chambers and ‘epistemic bubbles’:
“…there are two very different phenomena at play here, each of which subvert the flow of information in very distinct ways. Let’s call them echo chambers and epistemic bubbles. Both are social structures that systematically exclude sources of information. Both exaggerate their members’ confidence in their beliefs. But they work in entirely different ways, and they require very different modes of intervention. An epistemic bubble is when you don’t hear people from the other side. An echo chamber is what happens when you don’t trust people from the other side.”
In short: bubbles restrict information; chambers restrict trust.
Nguyen goes on to explain that filtering information isn’t necessarily a bad thing – if we didn’t, we’d be overwhelmed by the stuff. In any case, it’s quite hard to construct an information filter that’s entirely impenetrable to ‘inconvenient truths’ – sooner or later all but the most determined hermit is going to encounter contradictory information.
This brings us to the real problem – which is not that such information is unheard, but that it’s untrusted. For the truth to change a mind (or at least broaden it), getting through the information filter (i.e. the epistemic bubble) isn’t enough. It also has to get through the trust filter (i.e. the echo chamber):
“An ‘echo chamber’ is a social structure from which other relevant voices have been actively discredited. Where an epistemic bubble merely omits contrary views, an echo chamber brings its members to actively distrust outsiders…”
“The result is a rather striking parallel to the techniques of emotional isolation typically practised in cult indoctrination. According to mental-health specialists in cult recovery, including Margaret Singer, Michael Langone and Robert Lifton, cult indoctrination involves new cult members being brought to distrust all non-cult members. This provides a social buffer against any attempts to extract the indoctrinated person from the cult.”
Nguyen goes on to say that you can’t get someone out of an echo chamber by “bombarding [them] with ‘evidence’.” Indeed an aggressive assault on an information filter may well provoke a reinforcement of that more formidable barrier – the trust filter.
But if some information filters can be useful, can’t the same be true of some trust filters? Isn’t it entirely natural – and, in many cases, essential – to trust some sources more than others?
The answer to that is obviously ‘yes’, which leaves us all with a constant challenge: how to distinguish a rational trust filter from a cultish echo chamber.
Here are three questions to ask about the trust filters that you place between yourself and the world:
- Firstly, do you (a) assume good faith on the part of outsiders unless they actively forfeit your trust or (b) distrust outsiders by default?
- Secondly, on what grounds do you ever distrust outsiders – is it primarily because of (a) flaws in their evidence or methodology, or (b) who they are?
- Thirdly, are those who share your trust filters (a) subject to the same criteria that you apply to outsiders, or (b) do they get a free pass?
If all or most of your answers are (b) then, trust me, you’ve got a problem.