
These days, most UK members of Parliament are graduates – 86%, according to the Sutton Trust. However, only a small minority of our politicians have degrees in the "STEM" subjects of science, technology, engineering, mathematics or medicine. A Nesta survey of Parliamentary candidates for the 2017 general election found that just 9% were STEM graduates. The adjacent profession of journalism is also heavily biased towards the humanities.
Given the profound challenges posed by issues like climate change, genetic engineering and artificial intelligence, couldn't we do with more scientific knowledge among our decision-makers and opinion-formers?
In fact, wouldn't a generally higher level of scientific literacy among the public encourage a less polarised, more objective culture of politics?
Perhaps not. According to an eye-opening piece by Dan M Kahan for Scientific American, knowing about science does not equate to objectivity on science-based issues:
"Simply put, as ordinary members of the public acquire more scientific knowledge and become more adept at scientific reasoning, they don't converge on the best evidence relating to controversial policy-relevant facts. Instead they become even more culturally polarized.
"This is one of the most robust findings associated with the science of science communication. It is a relationship observed, for example, in public perceptions of myriad societal risk sources – not just climate change but also nuclear power, gun control and fracking, among others."
It is said that "a little knowledge is a dangerous thing", but it would seem that a lot is even worse.
But why would scientific literacy and skills in scientific reasoning result in less objectivity, not more? Kahan's answer is that it all depends on how that knowledge and ability are used:
"…if someone does enjoy special proficiency in comprehending and interpreting empirical evidence, it is perfectly predictable that she'll use that skill to forge even stronger links between what she believes and who she is, culturally speaking."
Perhaps this shouldn't come as such a surprise. As I've written before, the idea that ignorance of contrary facts and opinions is the great driver of polarisation is something of a myth. The reality is that most people are exposed to the arguments of the "other side", but they tend to become more, not less, entrenched in their views as a result. One might have hoped that scientifically literate people considering scientific issues would be immune to this effect, but evidently not.
There is, however, a chink of light. Though neither scientific knowledge nor scientific reasoning ability is associated with objectivity, scientific curiosity is:
"In general population surveys, diverse citizens who score high on the Science Curiosity Scale (SCS) are less divided than are their low-scoring peers…
"Experimental data suggest why. Afforded a choice, low-curiosity individuals opt for familiar evidence consistent with what they already believe; high-curiosity citizens, in contrast, prefer to explore novel findings, even if that information implies that their group's position is wrong. Consuming a richer diet of information, high-curiosity citizens predictably form less one-sided and hence less polarized views."
Kahan describes curiosity as "a hunger for the unexpected, driven by the anticipated pleasure of surprise" – which is wonderful, but what can science communicators do to encourage this frame of mind?
Perhaps it's more a question of what not to do. A campus culture of "safe spaces" and "trigger warnings" strikes me as rather discouraging of curiosity. Indeed, if we teach young minds to catastrophise challenges to one's prior assumptions, then unexpected and surprising perspectives will be viewed with, at best, suspicion.
One could argue this is mainly a problem in the humanities, but the sciences are not immune – and certainly not the public discussion of scientific subjects.
For example, how should science communicators deal with pseudo-sciences like homeopathy and astrology? One approach would be to equip people with the mental tools to distinguish between hard evidence, interesting speculation and obvious nonsense. The other approach is to catastrophise the nonsense, viewing it as a civilisational threat that needs to be driven out of the public sphere. My entirely subjective impression is that we've swung towards the latter approach in recent years.
I can understand why people should have a strong aversion to snake oil, but a paranoid reaction is counter-productive.
For a start, it mischaracterises the real threats to scientific progress. For instance, I'd provisionally estimate the negative impact of TV astrologers on astrophysics at approximately zero (though I'm open to evidence to the contrary).
There's also a danger that an atmosphere of censorious hostility will spill over from the obvious nonsense to issues where the scientific evidence is not quite so clear-cut. It's a tone that not only induces a defensive hardening of positions, but also kills the curiosity that is our best bet if we want people to objectively weigh up the balance of evidence. That's really important on highly complex issues like climate change, where skilled lobbyists can find elements of doubt, even though the bigger picture supports the case for action.
Ultimately, scientific progress depends on people who are willing to step beyond what is already established and investigate what is not. There are good and bad ways of going about this, but it all begins with curiosity.