We have heard a lot in recent years about ‘fake news’: a concept that is not only ill-defined but was weaponised from the outset, so that it has come to mean little more than ‘news I do not like or agree with’.
Yet, if we agree that there is at least one particular societal health-hazard included in the definition (that is, news which is wholly or partially fraudulent), then we ought also to recognise that the concept has a number of counterparts that are equally capable of demeaning the public discourse. Foremost among these is something that does not even have a name. But if it did, it might be ‘fake gatekeeping’. Although ‘erroneous gatekeeping’ might be more accurate.
I am thinking of those people who present themselves as referees of the era: keepers of the narrative. People who weigh up stories and events and pronounce not just what is right and wrong, but what is discussable and what is not — who allow themselves to declare what is and is not off limits.
The media plays a significant role in such activities, as do politicians. But so too do other actors, including university departments and think-tanks. And the problem with assuming the role of arbiter is that you must have 100% accuracy in your choice of targets. When you draw a line, it should be absolutely correct.
Otherwise there is an obvious danger. If you declare something to be untrue which turns out to be true, or if you claim something to be a ‘conspiracy theory’ when it should in fact simply be deemed ‘a theory’, then you don’t only diminish your own standing. If what you have labelled a conspiracy theory turns out to be true — or even within the realms of the possible — then you diminish the ability of wider society to determine what is true and false. You cast doubt on the standing of other ‘conspiracy theories’. And you tarnish the concept of truth itself.
During the early weeks of the coronavirus pandemic, there was a fair amount of this erroneous narrative-gatekeeping. Outside of a relatively narrow group of experts, few public figures or institutions were warning people about the likelihood of the events that we are all now living through. And many of those who were focused on other problems — especially those individuals and institutions that seek relevance in order to secure funding — have had to do a sharp pivot.
Some are aware that they cannot become virologists in ultra-quick time, or have their views on pandemics suddenly listened to. But they can continue to attempt to decide where the parameters of reasonable discussion ought to sit. And it is here that a further erosion in trust between the public and self-identified experts is at risk of occurring.
A report by the Institute for Strategic Dialogue in London presents a fine example of what can happen when people pivot onto something they do not know about. The ISD has done work in recent years in what has become known as ‘the online space’, especially on the issue of online extremism. When the coronavirus came along, the ISD chose, like similar organisations, to try to stay relevant. In the ISD’s case this included publishing an online paper on alleged Covid-19 disinformation. It explained that this was part of a series of briefings from ISD’s Digital Research Unit which aimed to:
“…expose how technology platforms are being used to promote disinformation, hate, extremism and authoritarianism in the context of COVID-19. It is based on ISD’s mixture of natural language processing, network analysis and ethnographic online research. This briefing focuses on the way far-right groups and individuals are mobilising around COVID-19 in the US. The first briefing in the series can be found on ISD’s website.”
Given that extremists of all kinds, from the far Left and far Right to the religiously motivated, are likely to try to use this moment and its aftermath to further their own political and religious ends, it seems worth trying to identify the games that they are playing. The better to warn people away from such bad actors and purveyors of disinformation.
The reason why the public may be especially susceptible right now is in part because, faced with a challenge that had until recently received relatively little thought, we don’t yet have the societal muscles to deal with it. We need to work our way through this exceptionally carefully.
Unfortunately, on this occasion, the would-be gatekeepers have not developed the requisite muscles either. The ISD briefing says that a far-Right online community has “mobilised” to “advance a range of… conspiracy theories relating to COVID-19”. As well as anti-Semitic tropes, the ISD lists as other ‘conspiracy theories’: the idea that this is a deep-state plot; the idea that it is a cover for celebrity arrests; and the idea that the virus was developed in a Chinese laboratory. It should not be hard to spot which of the conspiracies listed is the odd one out. Of the last, the ‘bioweapon’ conspiracy, the ISD says:
‘This theory is part of a wider right-wing conspiracy which some QAnon supporters have adopted. It claims that COVID-19 didn’t emerge from a food market in Wuhan but was rather engineered in a nearby laboratory and then released, either deliberately or accidentally. The main piece of evidence to support this claim is that China’s only Biosafety Level 4 lab (the maximum safety level used to deal with highly dangerous pathogens) is also located in Wuhan, and conservative media has repeatedly highlighted the connection, despite experts saying that there is absolutely no scientific evidence that the genome is man-made.’
Here is the problem. The ISD’s analysis attempts to put the idea that the virus originated in the Wuhan laboratory into the basket of ‘conspiracy theories’. But only weeks after the publication of the ISD’s paper, a report put together by concerned Western governments kept the possibility that the virus leaked from the laboratory in Wuhan on the table. At a time when the Five Eyes intelligence network continues to investigate and take seriously the possibility that the virus leaked from the laboratory (whether deliberately or otherwise), why should this idea be deemed a ‘conspiracy theory’?
Of course, the ISD’s paper is not alone in trying to make this claim. But it is symptomatic of a problem which we must weigh up alongside the problem of fake news. That is, the problem of faked authority. In particular, the problem that arises when non-experts decide what is and what is not permissible to say. Of course, it is their right to do so. But to allow it without challenge is to ignore the wider societal damage that is done when such inexact pronouncements are made. When the term ‘conspiracy theory’ is watered down, or made redundant by being applied to things that may be true, then we’re in trouble. It shuts down interrogation.
People who have been told that something is a ‘conspiracy theory’, only to learn that major Western governments are looking into the exact same thing, might be forgiven for being far more sceptical in future whenever ‘conspiracy theory’ is used as a gatekeeping term. At the far end of this, some people may even decide that other ‘conspiracy theories’ are in fact true, or at least plausible.
We all know the damage that can be done by the dissemination of false facts. But we should begin to consider whether equal damage is not done by those who try to stop people from considering questions that not only can be looked into, but must be.