

December 18, 2018

These days, most UK members of Parliament are graduates – 86%, according to the Sutton Trust. However, only a small minority of our politicians have degrees in the ‘STEM’ subjects of science, technology, engineering, mathematics or medicine. A Nesta survey of Parliamentary candidates for the 2017 general election found that just 9% were STEM graduates. The adjacent profession of journalism is also heavily biased towards the humanities.

Given the profound challenges posed by issues like climate change, genetic engineering and artificial intelligence, couldn’t we do with more scientific knowledge among our decision-makers and opinion-formers?

In fact, wouldn’t a generally higher level of scientific literacy among the public encourage a less polarised, more objective culture of politics?

Perhaps not. According to an eye-opening piece by Dan M Kahan for Scientific American, knowing about science does not equate to objectivity on science-based issues:

“Simply put, as ordinary members of the public acquire more scientific knowledge and become more adept at scientific reasoning, they don’t converge on the best evidence relating to controversial policy-relevant facts. Instead they become even more culturally polarized.

“This is one of the most robust findings associated with the science of science communication. It is a relationship observed, for example, in public perceptions of myriad societal risk sources—not just climate change but also nuclear power, gun control and fracking, among others.”

It is said that ‘a little knowledge is a dangerous thing’, but it would seem that a lot is even worse.

But why would scientific literacy and skills in scientific reasoning result in less objectivity, not more? Kahan’s answer is that it all depends on how that knowledge and ability are used:

“…if someone does enjoy special proficiency in comprehending and interpreting empirical evidence, it is perfectly predictable that she’ll use that skill to forge even stronger links between what she believes and who she is, culturally speaking.”

Perhaps this shouldn’t come as such a surprise. As I’ve written before, the idea that ignorance of contrary facts and opinions is the great driver of polarisation is something of a myth. The reality is that most people are exposed to the arguments of the ‘other side’, but they tend to become more, not less, entrenched in their views as a result. One might have hoped that scientifically literate people considering scientific issues would be immune to this effect, but evidently not.

There is, however, a chink of light. Though neither scientific knowledge nor scientific reasoning ability is associated with objectivity, scientific curiosity is:

“In general population surveys, diverse citizens who score high on the Science Curiosity Scale (SCS) are less divided than are their low-scoring peers…

“Experimental data suggest why. Afforded a choice, low-curiosity individuals opt for familiar evidence consistent with what they already believe; high-curiosity citizens, in contrast, prefer to explore novel findings, even if that information implies that their group’s position is wrong. Consuming a richer diet of information, high-curiosity citizens predictably form less one-sided and hence less polarized views.”

Kahan describes curiosity as “a hunger for the unexpected, driven by the anticipated pleasure of surprise” – which is wonderful, but what can science communicators do to encourage this frame of mind?

Perhaps it’s more a question of what not to do. A campus culture of ‘safe spaces’ and ‘trigger warnings’ strikes me as rather discouraging of curiosity. Indeed, if we teach young minds to catastrophise challenges to their prior assumptions, then unexpected and surprising perspectives will be viewed with, at best, suspicion.

One could argue this is mainly a problem in the humanities, but the sciences are not immune – and certainly not the public discussion of scientific subjects.

For example, how should science communicators deal with pseudo-sciences like homeopathy and astrology? One approach would be to equip people with the mental tools to distinguish between hard evidence, interesting speculation and obvious nonsense. The other is to catastrophise the nonsense, viewing it as a civilisational threat that needs to be driven out of the public sphere. My entirely subjective impression is that we’ve swung towards the latter approach in recent years.

I can understand why people should have a strong aversion to snake oil, but a paranoid reaction is counter-productive.

For a start, it mischaracterises the real threats to scientific progress. For instance, I’d provisionally estimate the negative impact of TV astrologers on astrophysics at approximately zero (though I’m open to evidence to the contrary).

There’s also a danger that an atmosphere of censorious hostility will spill over from the obvious nonsense to issues where the scientific evidence is not quite so clear cut. It’s a tone that not only induces a defensive hardening of positions, but also kills the curiosity that is our best bet if we want people to objectively weigh up the balance of evidence. That’s really important on highly complex issues like climate change, where skilled lobbyists can find elements of doubt, even though the bigger picture supports the case for action.

Ultimately, scientific progress depends on people who are willing to step beyond what is already established and investigate what is not. There are good and bad ways of going about this, but it all begins with curiosity.


Peter Franklin is Associate Editor of UnHerd. He was previously a policy advisor and speechwriter on environmental and social issues.
