Shadowy Facebook: Mark Zuckerberg. Credit: Win McNamee/Getty


February 12, 2021

What do you do when your sources of information get corrupted? That is one of today’s great questions, as UnHerd discovered this week. On Wednesday, Facebook censored an article on these pages which was critical of the World Health Organisation, labelling it as “misinformation”. It was not UnHerd’s first run-in with the online censors, but it is perhaps the most baffling.

In the article in question, Ian Birrell suggested that there are very many reasons to be suspicious of the WHO’s recent report into the origins of the coronavirus. Its investigations were brief, its research was flimsy and the composition of its team was questionable. But most glaring of all was surely its attempt to exclude from consideration anything which might be inconvenient for the Chinese Communist Party. It concluded, for example, that there was no evidence that the virus had come either from the Wuhan wet market or from the government-run laboratory in the area.

Birrell remarked on all of this and much more in his piece. All of it is public information — and in any healthy society it would be part of the public debate. I suspect that this eventually dawned on Facebook, which last night apologised and reinstated the piece. But why did it decide that the article constituted “misinformation” in the first place?

It’s worth noting, of course, that Facebook does have form when it comes to censorship involving the Chinese Communist Party, and it does seem a remarkable coincidence that the one UnHerd article to receive such a content warning was deeply critical of the world’s most powerful totalitarian state. Moreover, a pattern of Big Tech censorship has emerged in recent years in which dissident voices are smothered until the embarrassment caused becomes too much of a PR own goal for a platform, at which point it is announced to have been a simple mistake. Someone pressed the wrong key. Perhaps.

That certainly seems to be the gist of Facebook’s very brief explanation yesterday: “a fact-checking label was wrongly applied”. But in the absence of a more detailed statement, it’s still worth exploring what the company would say if it ever decided to make a genuine effort at accountability. If that were to happen, I suspect it would go something like this: “the world is going through a pandemic and it is therefore exceptionally important that internationally recognised health organisations such as the WHO do not have their credibility undermined.”

Of course, that is just my conjecture — though, in light of Big Tech’s behaviour over the past year, it certainly seems conceivable. However, the problem with such an explanation is two-fold.

First, there is the presumption that an international body like the WHO not only has not been corrupted by the Chinese Communist Party, but that it could not be. To see how naive this view is, we need only look at that other international organisation so often viewed as beyond reproach: the UN Human Rights Council in Geneva.

To an outsider, the UNHRC (like the WHO) may well sound like a venerable organisation. But look a little closer and it becomes clear that the entity is a farce. Only last month it allowed the North Korean representatives at the Council to spend their time expressing concerns about the human rights record of Australia. The truth is that these organisations are far less virtuous than one might think — and if we’re not allowed to criticise them, then what can we criticise?

The second problem with Big Tech censoring material critical of the WHO is more obvious: there is simply no consistency in the position. If it were the case that Facebook censored or flagged all online material which questioned the WHO’s narrative, then that would be one thing. But that is not what is happening. For in reality, Facebook only tends to censor material which goes against the advice of the WHO if it leans in a particular direction.

For instance, since the start of the pandemic the WHO has supported the introduction of lockdown measures. But it has also repeatedly said that such lockdowns should be temporary, or otherwise short in duration. Yet in recent months, Big Tech censorship has only been aimed at people arguing against lockdowns, or urging people to break the restrictions — while completely ignoring the many people still arguing for their extension. The problem, in other words, is a double-standard. If the WHO’s advice is sacrosanct, why is one alternative view worthy of censorship but another one permitted?

Ultimately, such an approach is entirely self-defeating. For if the credibility of the WHO is not as sacrosanct as Facebook seems to think, then by censoring dissenting views the platform is essentially protecting a polluted information source — all while claiming to be doing the very opposite.

The UK, among other countries, is now approaching the first anniversary of the moment when everybody was first consigned to their homes. For much of that time we have been deprived of our social antennae — unable to hear from friends and family in person or to gather in large groups to exchange ideas. Instead, we have had to rely increasingly on the online world to communicate and share our thoughts. It is in light of that situation that sinister decisions such as Facebook’s this week need to be considered. This was not the first time that the social media giants have shown themselves to be unaccountable, not to mention incompetent. And I predict that it will not be the last.


Douglas Murray is an author and journalist.
