For much of human history, doubt was considered a personal vice. Status and advancement were generally conferred on believers and cheerleaders for the prevailing orthodoxy. Questioning the status quo was regarded as sedition and, as a result, discussions of “doubt” were confined to pedantic philosophers determined to discover whether anything in the world could really be known.
It was not really until David Hume, writing during the Scottish Enlightenment, that an attempt was made to reconcile Scepticism with the real world. Frustrated at the “insipid raillery” of those who claimed mankind could know nothing, he dismissed their obscure thought experiments as “mere Philosophical Amusement”, and instead chose to reclaim Scepticism as a critical mindset. To put it simply, for Hume it was important to be “a philosopher; but, amidst all of your philosophy, be still a man”.
At the end of last year, Edinburgh University renamed its David Hume Tower because the philosopher — at least according to those who demanded the change — “wrote racist epithets”. Whether or not that is true is an argument for another day. But this erasure of Hume — the Enlightenment philosopher who reclaimed the meaning of scepticism more than 200 years ago — is symbolic of something far more significant. For it seems to me that the term “sceptic,” and the attitude it represents, is once again in urgent need of rehabilitation.
On paper, that shouldn’t be too difficult. As Hume put it in his Enquiry Concerning Human Understanding, to be sceptical is “to begin with clear and self-evident principles, to advance by timorous and sure steps, to review frequently our conclusions, and examine accurately all their consequences”. At the time, this was radical. It encompassed everything progressive about the Enlightenment and the emergence of the scientific method. But it also seems eminently sensible. Who wouldn’t want to be a sceptic today?
Apparently, quite a lot of people. Scepticism is suddenly perilously out of fashion. More than that, it is now deemed dangerous. The reason? The rise of the “lockdown sceptics”, who in recent weeks have taken a battering for having made claims about the virus that turned out not to be true.
In a sense, this is what should happen in the scientific method — commentators and experts being held to account for predictions they make. But the ferocity of the attacks has left us in a place where all questioning groups are subjected to the same moral condemnation. Whether they are pundits peddling conspiracies, credentialed scientists recommending alternative approaches, or intellectuals worried about the political implications — “lockdown sceptics” is used interchangeably for them all. Any dissent will mark you out as part of the global “anti-science” movement. So “sceptic” has become a dirty word.
In our own LockdownTV interview series, which kicked off early in the pandemic, we put the scientific method into action. We featured eminent scientists from all sides of the debate sharing their views: advocates for a stronger response such as Profs Neil Ferguson, Nathalie Dean, Devi Sridhar and Fredrik Elgh put forward their arguments alongside advocates of a more liberal approach such as Johan Giesecke, Anders Tegnell and the Great Barrington scientists. As you might expect, some statements aged better than others; some voices became more credible and others fell away, but it was all conducted in a spirit of civility and respectful enquiry.
When Nobel prize-winner Michael Levitt’s prediction that the American epidemic would end in August failed to materialise, we had him back on the show to account for himself. Both Giesecke and Tegnell have also pledged to return later this year to discuss what they have learned.
But the treatment of some of these scientists is troubling. Oxford Professor Sunetra Gupta is now bracketed with out-and-out cranks by MPs and columnists in national newspapers. I know her to be a kind, intelligent and accomplished person who cares deeply about saving lives. Perhaps she shouldn’t have speculated so specifically about the Infection Fatality Rate in her UnHerd interview so early in the pandemic; perhaps she should have caveated her theory that the epidemic was “on the way out” with the likelihood that it could come back again in the winter.
But she was a scientist giving her interpretation of the data at the time, not a trained politician stiffly parsing every word. The attacks on her fellow signatories of the “Great Barrington Declaration” by academic colleagues at Harvard and Stanford, as well as the “blood on their hands” rhetoric they regularly receive in the British press, set a disturbing precedent for the next scientific controversy. Who would be a dissenting scientist now?
Creative thinking should be encouraged, not penalised. If new variants or developments extend this lockdown period beyond the spring, a more differentiated approach closer to the one the GBD posited — offering better protection to the vulnerable and letting others get on with their lives — may yet become relevant. When Boris Johnson received a briefing from Gupta, Carl Heneghan and Anders Tegnell, he rejected their proposals and continued with a lockdown strategy regardless — fair enough. But the Prime Minister should surely be praised rather than condemned for wanting to test and retest his assumptions?
At the level of government policy, the effect of the pandemic on attitudes should also give us pause. In the relentless comparison of different countries’ outcomes, the recipe for success on Covid-19 policy appears to be something like: presume new risks are worst-case scenarios and act immediately, and dramatically, rather than waiting for more evidence. Countries which acted in the opposite way to David Hume’s “timorous and sure steps” and swiftly imposed border controls and restrictions have generally had the best Covid-19 outcomes; “dithering” Western nations are unfavourably compared to their more commanding Asian counterparts. But while this may have been effective in the context of the pandemic, as a general lesson for government it would represent a terrifying new direction.
Many people might feel that we are in a wartime situation with this virus, and that something closer to martial law is therefore appropriate for this exceptional period. But as human rights lawyer Adam Wagner made clear in yesterday’s UnHerd interview, if history is any guide, emergency measures have a way of becoming permanent — most of the powers taken by the government in the aftermath of 9/11 are still there. We have now been governed in a “Napoleonic” style for almost a year, with new updates coming by ministerial decree on average every four and a half days and with very little oversight. Can anyone confidently say that none of that attitude will stick?
When it comes to the inevitable next virus or pathogen to be identified somewhere around the world, the playbook has surely now been written. As Wagner told me, “the danger is that if Covid never leaves us, or it mutates or a different virus arrives with a similar dynamic, we’ll be in a semi-permanent state of ‘this is what we do’ — when this happens, we have lockdowns, we have emergency laws, we take away parliamentary niceties like scrutiny, debates, votes, that sort of thing… And I think that is a danger that doesn’t come out of the fringes of the lockdown sceptic movement. That’s the real deal as a worry.”
At times of crisis, scepticism can be unnerving and the temptation to try to silence dissenting voices is understandable. But which is the bigger danger? That people are allowed to question the orthodoxy and potentially get things wrong but are held accountable in an open debate? Or that sceptical voices are censored for “misinformation”, and no one dares dissent? Everyone loses when doubt becomes a vice once more.