
Is identity politics undermining science?

New research illustrates how funding has become politicised
by UnHerd Staff, 17 November 2021

The influence of Left-wing identity politics on the media is beyond doubt. That’s not an opinion, but a quantifiable fact. By counting the number of times that relevant terminology like “whiteness”, “male privilege” and “cultural appropriation” is used in mainstream publications like the New York Times, one can chart an explosion of coverage over the last decade. 
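
The counting approach is simple enough to sketch. The snippet below is a minimal Python illustration; the corpus layout (year and text pairs) and the term list are assumptions made for the example, not the pipeline of any particular study.

```python
# Minimal sketch: count how many articles per year mention at least one
# tracked term. The corpus format and term list are illustrative assumptions.
from collections import Counter

TERMS = ["whiteness", "male privilege", "cultural appropriation"]

def mentions_per_year(corpus):
    """Count, per year, the articles mentioning at least one tracked term."""
    counts = Counter()
    for year, text in corpus:
        lowered = text.lower()
        if any(term in lowered for term in TERMS):
            counts[year] += 1
    return dict(sorted(counts.items()))

# Toy data: only the 2019 articles contain tracked terms.
corpus = [
    (2010, "A review of the new season's theatre."),
    (2019, "An essay on whiteness and the modern novel."),
    (2019, "Debates over cultural appropriation in fashion."),
]
print(mentions_per_year(corpus))  # {2019: 2}
```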

However, this isn’t just happening in the media. A new report from the Center for the Study of Partnership and Ideology (CSPI) looks at an area one would hope would be free from ideological influence: scientific research. 

The report’s author, Leif Rasmussen, uses natural language processing to analyse the abstracts of successful grant proposals to the National Science Foundation, the main grant-giving body for scientific research in the US.

This is the key finding: 

As of 2020, across all fields 30.4% of successful grant abstracts contained at least one of the terms “equity,” “diversity,” “inclusion,” “gender,” “marginalize,” “underrepresented,” or “disparity.” This is up from 2.9% in 1990. This increase is seen in every field.
- Leif Rasmussen, CSPI
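
As a rough sketch of how a statistic like that can be computed, the snippet below takes abstracts grouped by year and returns the share containing at least one tracked term. The data layout is hypothetical, and the report’s actual text processing may differ, but the idea is the same.

```python
# Rough sketch: per-year share of abstracts containing at least one of the
# tracked terms. Substring matching also catches inflections ("marginalized").
TERMS = ["equity", "diversity", "inclusion", "gender",
         "marginalize", "underrepresented", "disparity"]

def share_with_terms(abstracts_by_year):
    """Map each year to the fraction of abstracts mentioning any term."""
    shares = {}
    for year, abstracts in abstracts_by_year.items():
        hits = sum(
            1 for text in abstracts
            if any(term in text.lower() for term in TERMS)
        )
        shares[year] = hits / len(abstracts)
    return shares

# Toy example: one of three 1990 abstracts matches; both 2020 abstracts do.
data = {
    1990: ["Study of soil chemistry.", "Laser optics research.",
           "Gender differences in hearing."],
    2020: ["Improving diversity in STEM pipelines.",
           "Equity-focused outreach in physics education."],
}
print(share_with_terms(data))  # roughly {1990: 0.33, 2020: 1.0}
```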

That’s a big change, but is it necessarily a bad thing? Though the CSPI describes the overall trend as “politicisation”, one could argue that science can remain objective while being made more relevant to the lives of previously marginalised communities. Scientists should also address specific issues such as the underrepresentation of certain groups in medical trials, which can lead to an incomplete understanding of the effectiveness and side-effects of treatments.

Credit: CSPI

If researchers are wording their grant proposals to reflect a genuine and useful effort to make science more inclusive, then that could be counted as a positive trend. However, the CSPI research also finds a marked and recent uptick in the use of overtly ideological language like “intersectional” and “Latinx”. 

Credit: CSPI

Another potentially worrying trend identified in the report is an “increase in similarity between documents that is particularly pronounced beginning in 2017.” In other words, the language used in grant proposals is becoming less distinctive. Why would that be? Politicisation might be one explanation, but it could just be that scientists are generally getting better at wording proposals to tick the boxes of the grant-giving bureaucracy.  
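
For the curious, one standard way to quantify similarity between documents, though not necessarily the report’s exact method, is the average pairwise cosine similarity of TF-IDF vectors. A minimal sketch, assuming scikit-learn is available:

```python
# Generic sketch: average pairwise cosine similarity over TF-IDF vectors.
# This is one common measure, not the CSPI report's specific method.
from itertools import combinations

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def mean_pairwise_similarity(documents):
    """Average cosine similarity across all pairs of documents."""
    vectors = TfidfVectorizer().fit_transform(documents)
    sims = cosine_similarity(vectors)
    pairs = list(combinations(range(len(documents)), 2))
    return sum(sims[i, j] for i, j in pairs) / len(pairs)

# The first two toy abstracts share most of their wording.
docs = [
    "We propose a study of coral reef resilience.",
    "We propose a study of reef ecosystem resilience.",
    "A survey of medieval manuscript provenance.",
]
print(round(mean_pairwise_similarity(docs), 3))
```

If that average creeps upward year on year, each cohort of abstracts is, by this measure, becoming more alike in its wording.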

Whatever the reason, we should be wary of any trend towards groupthink. We certainly don’t need pseudo-science, in old forms or new, but the progress of actual science still depends on the ability to push beyond the established consensus.

As long as scientific rigour and ethical standards are maintained, grant-giving bureaucracies should be encouraging scientists to think differently, not the same.

Join the discussion


  • This is both very dangerous and a fantastic opportunity (if you are good enough to take it), in both cases for reasons that are not obvious.

    The danger is that it risks handing China a big advantage in research areas the corporate and academic West is vacating, because those seeking money for research are focusing on areas where the language that lands the grants follows ‘woke’ fashion, and those handing out the grants look for those tickboxes. Let’s be clear: the Chinese are going to have no such compunctions, and will overtake us.

    The reasons for the opportunity are more subtle, and are also linked to the fact that the corporate and academic West is vacating an arena. To explain, a bit of background first. In 1905, Einstein published the series of papers that ignited the nuclear age. From that standing start, in under four decades, the US had a working A-bomb. There are a number of implied lines of future development from that story, too complicated to discuss in detail here, that people are simply ignoring, even if they have an inkling about them. The shocking thing is that Einstein was essentially a lone individual, with no stellar academic background, no high-tech labs, no huge teams of high-quality colleagues and no access to the endless corporate and academic money-well, who produced his masterwork as a pure construction of symbols, working in his spare time.

    Now the opportunity. People outside tech don’t fully understand this, but programming creates *precisely* the same opportunity, open to a far bigger audience. In contrast with nuclear technologies, the programming ecosystem is far less costly, far less lethal and much more accessible. When nukes were first created in the 40s, the great fear was proliferation: that every tinpot regime would acquire them. It turned out not to be that simple; the bar over which most countries couldn’t jump was science and engineering. Nukes require high-end precision mechanics, device physics and electronics. They require teams of physicists and rocket scientists. Very few countries had the necessary scientific heft, so the nuclear divide persisted right up to the early 2000s. That natural check on nuclear tech, and the fact that the US (rather than the Nazis or the Bolsheviks) got to nukes first, was pure luck, but people don’t see it that way. These natural checks simply don’t exist with cutting code: anyone can do it, and all you need is a laptop, the brains and the will. Virtually everything you need to kick-start is available for free: training resources, compilers, dev tools, the lot. And the proof is the profile of the many, many companies of the internet age that started life in precisely this way.

    Of course the same opportunity is available to nations and to corporations – in spades. But if the corporate and academic world vacates the space for woke reasons, it becomes far easier for tiny outfits and even individuals to cash in.

    And this is the point: the next Einstein (or the next small obscure team in Latvia or Luton) who creates the next algorithmic biggie, be it in natural language processing, artificial life, AI, genetic modelling, or simply some automated means of hacking large numbers of computers, is then going to be plugged directly into the core infrastructure of the entire world by default, and may have the potential to gain control of… *everything*.

  • 100 years ago, apocalyptic nutters held up ‘the end is nigh’ signs on street corners. Today their inheritors glue themselves to motorways. You’re welcome to sympathise with them, Rasmus, but it doesn’t mean there is any more evidence behind these claims than there was 100 years ago.

  • Thanks for responding, Rasmus; like Galeti said, this place is better for having you in it.
    I agree with pretty much everything you say in the above post. Where we differ, I guess, is that I have no trust in the credentialed climate scientists from the first step, so I don’t trust anything that follows from them either. I could be instantly convinced by an experiment showing that CO2 is the determining factor, but I can remember a time when all climate scientists’ funding depended on, and their consensus was, that the hole in the ozone layer was the cause (of what they then called global warming; I guess now they’ve given up trying to predict whether it’s getting warmer or colder and settled on ‘it’s just changing’). They don’t mention the ozone layer so much now…
    To me this whole climate-science endeavour looks like medieval priests telling peasants who couldn’t read what the Bible says, and conveniently the Bible says: pay more tithes to your priest or suffer hell on earth.
