
Why have scientists stopped taking risks? There's a reason breakthroughs are now so rare

Great scientists don't work in isolation. Michael Williams/Getty Images


February 17, 2023   6 mins

A casual consumer of scientific journalism could be forgiven for thinking that we are living in a golden age of research. Systematic evidence, however, suggests otherwise. Breakthroughs comparable to the discovery of the structure of DNA — only 70 years ago — have been all too rare in recent decades, despite massive increases in investment. Scientific work is now less likely to go in new directions, and funding agencies are less likely to bankroll more exploratory projects. Even in those areas where scientific progress remains robust, making discoveries takes far more effort than it did in the past. The cost of developing new drugs, for example, now doubles every nine years.

Experts disagree on what has been holding science back. A common explanation is that potential discoveries have simply become fewer and harder to find, which absolves scientists and institutions of responsibility. In reality, similar complaints have been made in nearly every era, for example by late 19th-century physicists on the brink of discovering relativity. And such explanations can be self-fulfilling: it’s harder to get funding for ambitious exploratory work deemed infeasible by your peers.

To understand the slower pace of discovery, it is crucial to understand the process by which scientific breakthroughs happen. It can be illustrated by a surprisingly simple three-phase model. First, in the exploration phase, if a new scientific idea attracts the attention of enough scientists, they learn some of its key properties. Second, in the breakthrough phase, scientists learn how to utilise those key properties fruitfully in their work. Third, in the final phase, as the idea matures, advances are incremental. It still generates useful insights, but the most important ones have been exhausted; much of the work in this phase focuses on the idea’s practical applications.

Scientists are quite willing to work on ideas during the breakthrough phase — after all, everyone wants in on a project with good prospects. They are also willing to work on mature ideas, to reap the social rewards of success. But working on novel ideas exposes a scientist’s career to considerable risk, because most such ideas fail. This bias against exploratory science is a critical driver of the field’s stagnation, because the greatest risks often come with the greatest rewards. For example, researchers seeking to edit genes in mammalian cells in 2011 considered CRISPR a risky choice, because the technique was still largely undeveloped. Today, by contrast, it is one of the most celebrated advances in biomedicine.

The graph below shows the development of four hypothetical ideas — A, B, C and D — through the three stages of this model. Given sustained scientific effort in the exploration phase, ideas A and B will develop into important advances; idea A’s S-curve is steeper in the breakthrough phase, meaning it ultimately proves more significant to the broader scientific community. By contrast, ideas C and D will never amount to much, no matter how much effort is expended on them. The problem for scientists is that, in the exploration phase, the potential impact of all four ideas can appear nearly identical.

This bias against exploratory science points to a critical driver of scientific stagnation: scientists are frequently reluctant to spend their time exploring new ideas and have increasingly turned their attention to incremental science. This is backed up by quantitative evidence. University of Chicago biologist Andrey Rzhetsky and his colleagues found: “The typical research strategy used to explore chemical relationships in biomedicine… generates conservative research choices focused on building up knowledge around important molecules. These choices [have] become more conservative over time.” Another paper by the same team (led this time by UCLA sociologist Jacob Foster) also reports: “High-risk innovation strategies are rare and reflect a growing focus on established knowledge.” Meanwhile, a recent analysis by University of Minnesota sociologist Russell Funk and his colleagues tracks a “marked decline in disruptive science and technology over time”, and attributes this trend to scientists relying on a narrowing set of existing knowledge.

And yet, the underlying risk of failure in exploratory research has always been a feature of scientific investigation. So why is it dictating scientists’ behaviour more than ever? In short, because of the change in the way their success is measured. The rule for research scientists used to be “publish or perish”. First articulated in the Forties, this notion designates a scientist who publishes many papers as “productive”. But in recent decades, the importance of this metric has faded. Now, the popularity of a given article trumps all, and popularity is measured by the number of times other scientific papers cite it. Like the sports statistics sites that report the batting averages of professional baseball hitters, services such as Google Scholar and Web of Science report the extent of a scientist’s influence, valuing their work based on the number of citations it has garnered. The new mantra is: be influential or be sidelined.

This fixation has decreased the incentive to engage in exploration. When a scientist ventures into an emerging area of investigation, the work is unlikely to garner many citations, because few other scientists will be working on related topics. By contrast, as a successful idea matures, the relative certainty of making discoveries — however incremental — attracts more scientists to the area, and if many people are working in that field, their work will receive more citations.

The figures below illustrate the shift in research priorities following this change in incentives. Before the citation obsession, a healthy proportion of scientists was willing to engage in exploration. Today, scientists are less willing to play with new ideas, and instead pursue incremental advances. This shift in effort has been costly, as fewer ideas have developed into breakthroughs.

Noubar Afeyan, a co-founder of Moderna, is one of many scientists to note that incremental advances are the norm in academia today. He urges reforms to foster a culture in which scientific leaps receive more encouragement — and are rewarded to some extent even when they fail.

One way to achieve this involves indexing the text of a research publication based on the words and word sequences that appear — a process that reveals a list of the ideas upon which an article builds. A paper that relies on more recent ideas is more likely to reflect exploratory science — and services such as Google Scholar should report novelty measures in addition to citation metrics. While those novelty measures are not perfect, they are no more flawed than citation metrics as a measure of scientific influence, and offer one more way to evaluate work. A sports page that reports only batting average would provide a very incomplete picture of the value of a home-run-hitting slugger who strikes out frequently. Scientific evaluation services should similarly avoid presenting an incomplete picture.

An alternative, and idealistic, response would be to stop measuring scientific impact at all — but in reality, this is infeasible. Scientists, like other high-status professionals, cannot escape today’s relentless performance quantification.

Moreover, when used correctly, metrics can serve a useful purpose in allocating limited research dollars. Here, too, scientific endeavour will require a shift: university administrators and funding agencies alike should start using these novelty metrics when making hiring, tenure, promotion, and funding decisions. Doing so would greatly increase the incentive for scientists to pursue exploratory work. Science as a discipline would become more hospitable to people who currently stay out of academia for fear that exploration is punished rather than rewarded.

Fortunately, at least in the United States, this goal enjoys cross-partisan support. The Biden administration’s first budget, for instance, included $6.5 billion to establish ARPA-H, a new funding model for biomedicine that seeks to support “high-risk exploration that could establish entirely new paradigms” by taking funding decisions away from influence-obsessed peer reviewers. Still, this is merely an incremental step in the right direction; it directly affects the incentives of only the small subset of biomedical researchers engaged with the ARPA-H initiative. To truly reignite biomedical science, fundamental change is needed in the career incentives faced by a larger share of the research workforce.

The development of the mRNA Covid vaccines is an excellent example of what scientists can do when provided with the incentives to pursue novel work. It shows, too, that important discoveries are often the result of a sustained effort by a large community. Katalin Karikó, perhaps the most celebrated of the scientists who worked on the mRNA vaccines, credits hundreds of others for contributing to the effort. But these ideas only develop from infancy to breakthrough if many researchers are willing to try them out, develop them, and debate their merits. When many others are willing to take the risk of trying out new ideas, individual scientists are more likely to get involved.

The history of mRNA vaccine technology also illustrates the hostility that novel ideas face within science. Key papers from the technology’s early development were rejected time and again by the leading scientific journals, which fear losing status if they publish too much exploratory science, since so much of it goes nowhere. Yet, as we have seen, it is nearly impossible to judge the feasibility and significance of new ideas in their infancy, when they are still raw and poorly understood. And the rejection of innovative work by top journals renders negative judgments self-fulfilling: without the attention of other scientists, many ideas die in obscurity regardless of their potential.

It is time, then, to put science back on the path envisioned by the engineer Vannevar Bush in his 1945 report commissioned by President Roosevelt. Motivated by the loss of academic freedom during the Second World War, the report emphasised the importance of open-ended exploration for scientific progress, and Bush’s conviction that many valuable discoveries remained to be found was reflected in its title, “Science: the Endless Frontier”. In the decades since, a monomaniacal focus on influence in research evaluation has fettered that exploration. If we valued exploration more highly, our frontiers might start to expand again.


Jay Bhattacharya is a professor at Stanford University Medical School, a physician, epidemiologist, health economist, and public health policy expert focusing on infectious diseases and vulnerable populations.


45 Comments
J Bryant
1 year ago

I did biomedical research for a living almost thirty years ago. In those days, how often you appeared in the science citation index was certainly important, but publish or perish was the prevailing philosophy.
What the authors didn’t directly mention is the nature of grant funding which, so far as I know, hasn’t changed much over the past three decades. To be funded by the major granting bodies, notably NIH, you had to be established in the relevant subspecialty of research and write a proposal that relied closely on previous research and that was almost certain to yield some sort of results. In other words, especially if you were a junior faculty member, your proposal should be directed to the next obvious experiments in an established field.
There were a few sources of funding that targeted truly novel research. One was the Gates Foundation in its original iteration. At the beginning, Bill Gates and his father personally reviewed some proposals and simply wrote a check if they liked a proposal (it’s good to be a billionaire!).
I have immense respect for Prof. Bhattacharya, but I respectfully disagree with his, and his co-author’s, characterization of the mRNA vaccines as novel or breakthrough discoveries. The lipid particle technology used to deliver the mRNA is well-established and Moderna is currently being sued for patent infringement relating to that technology. Similarly, the mRNA component relies on high-purification of the synthetic mRNA and chemical modification of the mRNA to reduce its immunogenicity (immune response against the mRNA itself). Those technologies are also not new. The production of the mRNA vaccines in so short a time, and at such scale, was doubtless an impressive feat of commercial production and distribution, but it didn’t constitute Nobel-quality fundamental research, imo.

Andrew Horsman
1 year ago
Reply to  J Bryant

Thank you for this. Is your, or indeed the authors’, sense also that any proposal for research which might overturn or challenge a scientific “consensus” in which there is a large vested commercial interest is much less likely to receive funding than one which might confirm or elaborate on it? If this were true, it would also result in less innovation overall.

Rasmus Fogh
1 year ago
Reply to  Andrew Horsman

I doubt it, frankly. Sure, if you are working on something that would harm a large commercial interest, some funding pools would be closed for you. But most university funding is public money, and disruptive innovation is quite popular with entrepreneurial types these days. But what the article is talking about is early days, completely novel ideas, and they are too far away and too unpredictable for big corporations to waste their energy on trying to suppress them.

The article is closer to the mark. And there is an additional point. If you run a research group (it takes a group) you need to build up and maintain expertise, experienced co-workers, expensive kit, and connections, and that is on the order of a decade or more. But your funding comes mostly in three-year grants. So you need each funding application to produce clear results, in order to be able to get the next one. Smart groups apply for grants promising to deliver results that are already 80% complete (but unpublished) and use those three years to develop the results they are going to promise in the next round. That only works in areas where there is a reliable pipeline of regular results. If you are working on a completely new idea that will take a decade to deliver – and quite possibly will not deliver at all – you need someone to fund you on spec for the whole decade. And that is hard to get.

Ethniciodo Rodenydo
1 year ago
Reply to  Andrew Horsman

My grandfather was born in 1903, before the Wright brothers’ first successful flight and when homes were still lit with oil lamps.
His lifetime (and mostly the first 60 years) saw the development of the telephone system, domestic electricity (and gas), radio, passenger air travel, television, the jet engine, atomic power, space flight, the computer, antibiotics, man landing on the moon, the internet and the mobile phone.
In the last 35 years have we seen any developments equal to these achievements? If so, I cannot bring them to mind.
It seems to me that we have taken all the low-hanging fruit and that we are now well into the land of diminishing returns. You can never discount the possibility of a lone genius seeing something that no one else has seen or would ever likely see (Dirac for example) but today’s science seems to demand huge amounts of resource for increasingly incremental gain.

CHARLES STANHOPE
1 year ago

Most of those developments you speak of were greatly spurred on by War… “The Father of all things”.

J Bryant
1 year ago
Reply to  Andrew Horsman

I would agree with Rasmus Fogh that it’s probably unlikely university research will be suppressed because it challenges a major commercial interest. University researchers are, however, strongly encouraged to form alliances with industry and obtain funding for “applied” projects. I can certainly believe it will be more difficult to obtain such funding for research that’s likely to undermine commercial interests.

D.S. Huen
1 year ago
Reply to  J Bryant

I’d even say Covid-19 vaccines arose from the commercial failure of technologies originally worked on for gene therapy. Vaccines did not require the high transfection efficiencies demanded by gene therapy and proved to be a more valuable and achievable secondary target.
P.S.: I should note that the term “gene therapy” as used by scientists is not restricted to permanent modification of DNA, as claimed by Full Fact when rebutting Andrew Bridgen. Bridgen made a false claim concerning mRNA vaccines, but mRNA vaccines do come under gene therapy, just not for the purpose of modifying DNA. Just search Google Scholar for “mRNA gene therapy” to see that this is so.

hayden eastwood
1 year ago

This article really strikes a chord with me.

Once upon a time I worked in research biophysics. I really loved the PhD I did because I was left alone and allowed to explore an avenue that was interesting to me. But my experience was the exception rather than the rule and, I think, based on having a very kind and decent supervisor who protected me from the indignities of academia.
But once I finished and went into the postdoc cycle I had a rude awakening for all the reasons outlined above. But there were yet other reasons, not covered in this article:

First, I worked in a small research group in a narrow area of exploration where there were only 2-3 other labs in the World doing the same work. This meant that our scientific competitors were also our peer reviewers. This was not a good situation for obvious reasons.

Second, salaries were very low compared to industry, often 1/3rd or less compared to tech or finance. Add to that that most postdocs were funded, to save money, as “student scholarships”. This, together with the contract cycle being in the region of 2-3 years maximum, made it impossible to get a mortgage. This was one reason that, when I left science, it was well on its way to becoming a vocation for those with old money, rather than one for those most passionate about discovery.
Finally, I saw that academia was becoming more and more dominated by “goodie-goodies” motivated by Noddy badges (publication count), over and above interest in discovery itself. Watson and Crick hadn’t completed their PhDs at the time that they did their groundbreaking work – if they had done their PhDs when I did, they would have had to cut their research short after 3 years, write up a “lessons learned” piece, and likely not find a postdoc because they’d have not published anything. How many creative types capable of Nobel Prize work have been screened out in our quest for goodie-goodies whose minds are optimised for social recognition points based on low-creativity publication counts?

Johann Strauss
1 year ago

A minor correction: Watson had his PhD in hand when he went to the Cavendish for his postdoc. It was Crick who hadn’t yet got his PhD. But Crick was older owing to WWII. Further, Crick’s PhD work had nothing to do with the DNA double helix, although it was very helpful to figure out what was going on and to interpret Rosalind Franklin’s fiber diffraction data (which incidentally they failed to acknowledge or mention in the classic Nature paper but rather referred to Astbury’s data from the 30s, which was impossible to interpret as it was obtained from a mix of B and A DNA fibers).

Julian Pellatt
1 year ago

Good insights, Hayden! I wonder to what extent the rapid onset of the woke worldview has submerged scientific endeavour in the same glutinous mire that has choked originality, risk-taking and free expression in all other dimensions of human existence? I suspect the answer is: “Yes!”

Chris W
1 year ago

Three points:

1) The Eureka moment is not something that just happens. It comes after years of immersion in a problem. Today those years would be very costly – more likely that the scientist would be moved on to pastures new.

2) Times have changed and commercial pressures are different. If a scientist discovered a new battery for cellphones which prolonged their life indefinitely, the manufacturers would suppress the development because planned obsolescence is a critical part of manufacturing.

3) Confusing the issue and the graphs, we have a whole new type of scientist, one who calls himself a Social Scientist. These are people who have no scientific training but they still publish papers as pseudo-scientists. Their only skill, in fact, is to manipulate statistics. For every paper announcing a true scientific discovery there are probably ten later ‘vulture’ papers by Social Scientists who just move the figures around. This gives science a bad name and can be a disincentive for real scientists.

Rasmus Fogh
1 year ago
Reply to  Chris W

1) very true.
2) The article is talking about new ideas – the kind that would eventually make it possible to build a better battery. Manufacturers do not suppress at the level of basic science – neither IBM nor KODAK nor the Swiss watch industry tried to suppress the idea of the microchip.

Chris W
1 year ago
Reply to  Rasmus Fogh

You have to choose the right example to make the point.
New types of packaging, those which are better for the environment, are suppressed by supermarkets because they don’t want shorter storage windows.
My friend worked for a company which developed a process for extending the life of reinforced concrete, thereby saving the world millions of dollars. This was suppressed because the company was taken over by another which had its own ‘product’.
People have come up with discoveries which will remove carbon from the atmosphere, meaning that we don’t have to suffer with the ‘No Oil’ brigade. These were hushed up because (politically) we don’t want to be reliant on Russia and the Middle-East.

Rasmus Fogh
1 year ago
Reply to  Chris W

Exactly. You are talking about product development, invention, which is well in the realm of ‘incremental development’. You certainly have a point there, but the article is talking about basic science, exploration, which is far upstream from actual products.

AJ Mac
1 year ago
Reply to  Chris W

Excellent point. If the research is focused on finding an innovative algorithm to keep teenagers more glued to their phones or “discovering” the perfect b***r pill, we’re not exactly in pure exploration and discovery mode.
The suppression of true advances and breakthroughs in medicine, for example, is an actual, real-impact problem. And some people, even scientists with a very high IQ that is joined to a working conscience, will have a timid, conventional, or institutionally-shackled approach. No matter how much energy and money is supplied to counteract the presence of too much greed and misdirected energy.
But why do we always seem to be swallowing the spider to catch the fly, instead of controlling the fly infestations we’ve already unleashed?
We need a brave new harm-reduction incrementalism! Less Worthless Science Now ! (Make a t-shirt outta that).
I’m pretty sure the end products would suck less or present less damaging implications, as a rule, if the focus were less commercial and disruption happy. You wanna change the world for the better? Then don’t publish your trivial or nihilistic research in the name of Pure or Open Science. And reject ludicrous wealth or personal fame as primary motives. (Please!).
{curious editorial logic that permits “bullshit” to print but disallows a commonplace reference to “male enhancement”}

Prashant Kotak
1 year ago

There is a deeper question posed by the current stalling, about the nature of discovery, invention, originality and the conditions in which individuals arise who produce these.

The question I would like answered is, why, when there are more highly educated people on earth right now by literally orders of magnitude, is the Mathematics, Science and Literary output from the past not completely swamped by the sheer volume of new stuff produced in the last few decades? We are still looking, constantly, at not just Shakespeare and Newton and Pascal and Euler, but as much at those old Greeks and Romans from antiquity. The glib answer is of course, not anywhere near the volume of new stuff that you might expect has been produced, but my question is, why not, when the population has ramped?

The first thing to acknowledge, is that it seems the ‘numbers game’ doesn’t quite work – or at least not in a way that is a straightforward demographic extrapolation. You would expect, as the global population ramped up and education levels rose by orders of magnitude over the last couple of centuries, and more and more people came into the sciences in general, that there would be increasing numbers of people who come up with original new theories and proofs, or backing evidence, not just in the sciences but every possible domain including the humanities and the arts. And this kinda happened, up until the middle of the 20th century, and then the correlation seems to break down completely.

To expand on what I mean, the global population in antiquity was tiny compared to now, and the numbers of highly educated people absolutely miniscule, yet it was the Greeks who threw up, across just four or five centuries, the string of abstract thinkers from Aristotle to Archimedes to Euclid to Plato to Pythagoras to Thales, and literally dozens of others less well known but many equally profound, whose original output is still taught day in day out in schools. Not that most of those guys didn’t believe some pretty pretty odd things, but their original discoveries and inventions are still at the surface of today’s consciousness. Proving, if nothing else, that general intelligence is no guarantee of what we consider rational (and I mean rational in the ‘enlightenment’ sense, not the ‘technocratic tribune’ sense of ‘you must wear masks because not to do so is irrational, even though just yesterday I was laughing at you disdainfully for suggesting that masks might help stop the spread of bugs’). Also noteworthy is that the Romans, who (crudely speaking) conquered the Greeks, did *not* produce a string of similar thinkers in the same domains, and although we are still looking on in awe at Roman artefacts and literary production (Tacitus, and Suetonius and Virgil and Ovid and Catullus and so on) arguably that all is pretty thin gruel compared to the output of the neighbouring Greeks. Interesting also, the string of Italianate creatives and thinkers eventually did come of course, but over a thousand years after the dissolution of that empire in antiquity.

Anyway, back to the point I was trying to make before that rambling digression, in my fields of professional interest, Comp Sci and Electronics, the focus nowadays is mostly around producing tools for enabling engineering solutions. The bulk of the fundamental breakthroughs, in both computing mathematics and physics, all came pretty much across the first half of the 20th century, and thereafter the focus is increasingly on first application engineering, and then more recently, complexity management around that engineering effort – the fundamental breakthroughs have dried up. No recent equivalent for example, of the logic and computation theory aimed at answering ‘Entscheidungsproblem’, which Turing and Church and Godel and others produced literally decades ago. I would also note, as a clue, that Turing’s model turned out to be equivalent to Church’s model, and Post’s model, and Godel’s model, even though all those mathematicians and logicians all independently produced their models around the same time and were at first glance seemingly very different indeed.

Another observation I would make, is that through history we see ‘clumps’ of talent emerge, some of which is super high-end, surrounded by mostly barren periods, and it doesn’t seem to be a function of population density at all, but rather some undefinable, transitory, quality of a society at a point in time. Note that even in antiquity, it was not the dense population centers of south and east Asia where the flowerings of talent emerged, but in the first instance around corners of the Med, followed, much later, by nations with cold, hard, difficult geographies in northern Europe. Most astonishingly, we had the simultaneously resented, and admired, success and sheer volume of extraordinary souls produced by ‘rainy fascist island’ as MH might have put it, over four odd centuries, for no rhyme or reason that at least I can discern. I mean, why on earth this dratted little sceptered isle? What’s so special about it?

Mainland Europe, as a last hurrah, produced the ‘Martians’ and the ‘Vienna Circle’ (which it then pretty much gifted away to the United States), just as Europe decided to embark on a couple of massive self-destructive and self-impoverishing wars. Will it produce a second coming?

AJ Mac
1 year ago
Reply to  Prashant Kotak

Thanks for your analysis and reflections. At your conclusion you conjure (for me at least) an image from the end of the Keats Yeats {I’ve done this at least twice now; I know the difference between John Keats and W.B. Yeats but part of my brain doesn’t} poem “Second Coming” (published about 100 years ago):
And what rough beast, its hour come round at last, / Slouches toward Bethlehem to be born?
Perhaps we cannot manufacture innovation nor think our way out of all our problems. We can be quite sure that reincarnated versions (if you will) of Euclid or Pascal or Newton would not hold the exact same views as their antique, documented selves. But could they even remain within institutions or have their ideas received in their own lifetimes while holding the abstract, yet non-materialistic views (“odd things”) you’ve alluded to?
How many Galileos can we expect or claim to deserve, even putting aside the notion that we’ve discovered so much already? By which I’m suggesting: Yes, superstition and churchy restrictions are no longer a real hindrance to science, but there are major establishment norms and assumptions that suppress free scientific innovation, two of which might be termed “materialistic”: a focus on money and a widespread refusal to even entertain the presence of anything nonmaterial in the universe. So even things such as dark matter and human consciousness must (and will, “they” insist) be explained away by our own birth-less, un-free-willed, and intrinsic-purpose-free scientific lights.
Maybe some of today’s innovators are fired professors or Phd-dropouts living off the grid. Let’s just hope they don’t go Unabomber, before or after their research findings and manifestos are published.

Steve Murray
1 year ago
Reply to  AJ Mac

Superb thoughts. Thankyou.

AJ Mac
1 year ago
Reply to  Steve Murray

That’s kind of you. Thanks.

J Bryant
1 year ago
Reply to  Prashant Kotak

Great post and, yes, the questions you raise about creativity and originality are very interesting.
I suspect part of the answer is trivial in the sense that much groundbreaking research follows one or two fundamental discoveries that open important new areas of knowledge. The early 20th century is the obvious example. You mentioned the “Martians” which is a term I suspect many people will not know refers to a small group of mainly Jewish, Hungarian refugees from the Nazis who moved to the West and made significant contributions to math and physics. Von Neumann is probably the most famous. These people built on the relatively new fields of quantum mechanics and theoretical computer science. Some also worked out the math of a nuclear (fission) bomb. The tools were at hand, the new ideas were in place, and the gifted Martians ran with them.
What truly fundamental discoveries have we made in the past thirty years? What new, deep problems have appeared that are capable of being solved by the right people? My training was in chemistry/biochemistry. I remember a professor telling us, in the 1980s, that chemistry was essentially complete by the turn of the 20th century. Ouch! Not what grad students expected to hear. But he was correct in a way. Look at the chemistry/biochem/molecular biology literature today and it’s crammed with papers dotting i’s and crossing t’s. Even the Nobel prizes now seem to be awarded for “major advances” that are, in fact, much smaller in scope and significance.
Perhaps there is truth in the idea that we’re reaching the limits of human understanding, but I suppose people were saying that pre-Bohr/Einstein/Heisenberg.

Prashant Kotak
1 year ago
Reply to  J Bryant

The last point you make, about reaching limits, is interesting. I can see where you are coming from re the sciences, but I would then not expect that phenomenon to spill over into the arts. But I can point to the same type of drought, over at least the last half century, in many areas of the high arts – poetry and literature for example. To clarify, I mean by this a profile where there are many many in the ‘excellent’ category, and a drought in the ‘outright genius’ category.

To illustrate, let me link back to the ‘numbers game’ point I was making in my first post. I will localise to England, but the argument can be extended much wider. Between say, 1000 AD and 2000 AD, England produced Shakespeare in the late 1500s when population was around 3.5 million. Zooming forward to the mid 1800s by when England’s population was around 15 million, England had produced literally dozens of seminal figures in literature and poetry, including the Romantics etc, and the trend continues, through to the middle of the 20th century, and then… the production line of geniuses dries up. I don’t think there was any period since 1500 when there were not at least two towering poets in operation, and there were periods when up to half a dozen were around all at the same time. The period between 1900 and 1950 for example, had Auden and Eliot, and if I include all the British isles, Yeats and MacNeice, and of course a whole raft almost as good but just a rung below, like Chesterton, Dylan Thomas and so on.

My point is, the second half of the 20th century and onwards, when the population was 50 million and increasing, didn’t seem to produce anyone in quite the same class as those earlier figures, and I certainly don’t see any absolutely towering poet around right now, although there are plenty of very good ones around. But if 3.5 million produced a Shakespeare (and a Marlowe), then with 65 million now, I am owed, by sheer numbers, right here right now, (looks down, checks figures), 2 Shakespeares, 4 Byrons, 4 Eliots, 5 Dickens, 7 Larkins, 10 each of Sassoon and Owen, and so on.

And my question is, where are they all? Instead, someone seems to be attempting to palm me off with several thousands of Ian McEwans, and Margaret Atwoods and Sally Rooneys, etc – middling to good, and sometimes even excellent, but no Shakespeares. And I have to admit to feeling just a tad short changed here.

Steve Murray
1 year ago
Reply to  Prashant Kotak

Can I second the thanks that AJ Mac proffers for the absolute vitality of your contribution to this debate?
As a non-scientist, but someone who takes great interest in human creativity in general and seeks to understand scientific thought (such as quantum mechanics, string theory) without the foundational education in physics, your overview strikes me as essentially true and encompasses pretty much the history of human thought. It’s for such contributions that I subscribe to UnHerd, and therefore will also echo those who’ve praised the original article whilst also critiquing it.

Prashant Kotak
1 year ago
Reply to  Steve Murray

Thank you for the kind words!

AJ Mac
1 year ago
Reply to  Steve Murray

Seconded. My comments didn’t really belong with Mr. Kotak’s but I tend not to let that stop me. There are many brilliant and well-informed commenters at UnHerd, some of them experienced professionals and published academics or authors who provide insightful views you’d probably not encounter–at least in this conversational way, with an opportunity to engage them directly–in their more “official” work.

CHARLES STANHOPE
1 year ago
Reply to  Prashant Kotak

The answer to your interesting question is that we have NOT been following the Darwinian Imperative of ‘Survival of the Fittest’ but rather the suicidal policy of mass procreation of the dross, to lapse into the vernacular.

It will NOT end well, but fortunately I shall not be around to see it.

AJ Mac
1 year ago

Though our views seldom intersect, I hope you’ll not be leaving your digital pals too soon. You have a lot to say, some of which I agree with, or take amusement & instruction from.

CHARLES STANHOPE
1 year ago
Reply to  AJ Mac

Thank you.
I’ve just ‘seen off’ a brief onslaught of COVID-24, which necessitated an unprecedented 36 hours in bed! But thanks to my abuse of a ‘wonder drug’ I am now fighting fit again. (Much to the relief of my dogs!)

CHARLES STANHOPE
1 year ago
Reply to  Prashant Kotak

Sadly if the Romans had not rejected the ideas of Heron of Alexandria & Co, we would have been on the Moon 500 years ago, and probably achieved Armageddon as well.

Prashant Kotak
1 year ago

I have often wondered why the Roman Empire in fact dissolved, instead of kicking on into technological advances. As in, all sorts of historians have put up all sorts of theses, and while I was once upon a time willing to say “Oh, ok” to all the explanations put forward, I can’t say I buy any of them these days.

CHARLES STANHOPE
1 year ago
Reply to  Prashant Kotak

You presumably are aware of Suetonius’s account of how Vespasian rejected technology in order to ‘feed the poor’?

“mechanico quoque grandis columnas exigua impensa perducturum in Capitolium pollicenti praemium pro commento non mediocre optulit, operam remisit praefatus sineret se plebiculam pascere.”* Which translates as:-

“To a mechanical engineer, who promised to transport some heavy columns to the Capitol at small expense, he gave no mean reward for his invention, but refused to make use of it, saying: ‘You must let me feed my poor people.’”

Things were a little different at Wheal Vor in circa 1709, however!

(*Suetonius: Life of Vespasian. XVIII.)

Prashant Kotak
1 year ago

Thank you for pointing me to that passage. I wasn’t aware of that from Suetonius – I have read all sorts of bits and pieces of various Roman writers in translation, but none end to end or in depth.

AJ Mac
1 year ago
Reply to  Prashant Kotak

Thanks for your analysis and reflections. At your conclusion you conjure (for me at least) an image from the end of the Yeats poem “The Second Coming” (published about 100 years ago) {I’ve typed Keats for Yeats at least twice now; I know the difference between John Keats and W.B. Yeats, but part of my brain doesn’t}:
And what rough beast, its hour come round at last, / Slouches toward Bethlehem to be born?
Perhaps we cannot manufacture innovation nor think our way out of all our problems. We can be quite sure that reincarnated versions (if you will) of Euclid or Pascal or Newton would not hold the exact same views as their antique, documented selves. But could they even remain within institutions or have their ideas received in their own lifetimes while holding the abstract, yet non-materialistic views (“odd things”) you’ve alluded to?
How many Galileos can we expect or claim to deserve, even putting aside the notion that we’ve discovered so much already? By which I’m suggesting: Yes, superstition and churchy restrictions are no longer a real hindrance to science, but there are major establishment norms and assumptions that suppress free scientific innovation, two of which might be termed “materialistic”: a focus on money and a widespread refusal to even entertain the presence of anything nonmaterial in the universe. So even things such as dark matter and human consciousness must (and will, “they” insist) be explained away by our own birth-less, un-free-willed, and intrinsic-purpose-free scientific lights.
Maybe some of today’s innovators are fired professors or PhD dropouts living off the grid. Let’s just hope they don’t go Unabomber, before or after their research findings and manifestos are published.

J Bryant
1 year ago
Reply to  Prashant Kotak

Great post and, yes, the questions you raise about creativity and originality are very interesting.
I suspect part of the answer is trivial in the sense that much groundbreaking research follows one or two fundamental discoveries that open important new areas of knowledge. The early 20th century is the obvious example. You mentioned the “Martians”, a term I suspect many people will not know refers to a small group of mainly Jewish, Hungarian refugees from the Nazis who moved to the West and made significant contributions to math and physics. Von Neumann is probably the most famous. These people built on the relatively new fields of quantum mechanics and theoretical computer science. Some also worked out the math of a nuclear (fission) bomb. The tools were at hand, the new ideas were in place, and the gifted Martians ran with them.
What truly fundamental discoveries have we made in the past thirty years? What new, deep problems have appeared that are capable of being solved by the right people? My training was in chemistry/biochemistry. I remember a professor telling us, in the 1980s, that chemistry was essentially complete by the turn of the 20th century. Ouch! Not what grad students expected to hear. But he was correct in a way. Look at the chemistry/biochem/molecular biology literature today and it’s crammed with papers dotting i’s and crossing t’s. Even the Nobel prizes now seem to be awarded for “major advances” that are, in fact, much smaller in scope and significance.
Perhaps there is truth in the idea that we’re reaching the limits of human understanding, but I suppose people were saying that pre-Bohr/Einstein/Heisenberg.

Steve Murray
1 year ago
Reply to  Prashant Kotak

Can I second the thanks that AJ Mac proffers for the absolute vitality of your contribution to this debate?
As a non-scientist, but someone who takes great interest in human creativity in general and seeks to understand scientific thought (such as quantum mechanics and string theory) without a foundational education in physics, your overview strikes me as essentially true and encompasses pretty much the history of human thought. It’s for such contributions that I subscribe to UnHerd, and therefore I will also echo those who’ve praised the original article whilst also critiquing it.

Prashant Kotak
1 year ago

There is a deeper question posed by the current stalling, about the nature of discovery, invention and originality, and the conditions under which the individuals who produce them arise.

The question I would like answered is: why, when there are more highly educated people on earth right now by literally orders of magnitude, is the mathematical, scientific and literary output of the past not completely swamped by the sheer volume of new stuff produced in the last few decades? We are still looking, constantly, not just at Shakespeare and Newton and Pascal and Euler, but as much at those old Greeks and Romans from antiquity. The glib answer is, of course, that nowhere near the volume of new stuff you might expect has been produced; but my question is, why not, when the population has ramped up?

The first thing to acknowledge is that it seems the ‘numbers game’ doesn’t quite work – or at least not in a way that is a straightforward demographic extrapolation. You would expect, as the global population ramped up and education levels rose by orders of magnitude over the last couple of centuries, and more and more people came into the sciences in general, that there would be increasing numbers of people who come up with original new theories and proofs, or backing evidence, not just in the sciences but in every possible domain, including the humanities and the arts. And this kinda happened, up until the middle of the 20th century, and then the correlation seems to have broken down completely.

To expand on what I mean: the global population in antiquity was tiny compared to now, and the number of highly educated people absolutely minuscule, yet it was the Greeks who threw up, across just four or five centuries, the string of abstract thinkers from Aristotle to Archimedes to Euclid to Plato to Pythagoras to Thales, and literally dozens of others less well known but many equally profound, whose original output is still taught day in, day out in schools. Not that most of those guys didn’t believe some pretty odd things, but their original discoveries and inventions are still at the surface of today’s consciousness. Proving, if nothing else, that general intelligence is no guarantee of what we consider rational (and I mean rational in the ‘enlightenment’ sense, not the ‘technocratic tribune’ sense of ‘you must wear masks because not to do so is irrational, even though just yesterday I was laughing at you disdainfully for suggesting that masks might help stop the spread of bugs’). Also noteworthy is that the Romans, who (crudely speaking) conquered the Greeks, did *not* produce a string of similar thinkers in the same domains, and although we are still looking on in awe at Roman artefacts and literary production (Tacitus and Suetonius and Virgil and Ovid and Catullus and so on), arguably it is all pretty thin gruel compared to the output of the neighbouring Greeks. Interestingly also, the string of Italianate creatives and thinkers did eventually come, of course, but over a thousand years after the dissolution of that empire in antiquity.

Anyway, back to the point I was trying to make before that rambling digression: in my fields of professional interest, Comp Sci and Electronics, the focus nowadays is mostly on producing tools for enabling engineering solutions. The bulk of the fundamental breakthroughs, in both computing mathematics and physics, came pretty much across the first half of the 20th century, and thereafter the focus shifted increasingly first to application engineering, and then, more recently, to complexity management around that engineering effort – the fundamental breakthroughs have dried up. There is no recent equivalent, for example, of the logic and computation theory aimed at answering the ‘Entscheidungsproblem’, which Turing and Church and Gödel and others produced literally decades ago. I would also note, as a clue, that Turing’s model turned out to be equivalent to Church’s model, and Post’s model, and Gödel’s model, even though those mathematicians and logicians all produced their models independently around the same time, and the models at first glance seemed very different indeed.

Another observation I would make is that through history we see ‘clumps’ of talent emerge, some of it super high-end, surrounded by mostly barren periods, and it doesn’t seem to be a function of population density at all, but rather of some undefinable, transitory quality of a society at a point in time. Note that even in antiquity, it was not in the dense population centers of south and east Asia that the flowerings of talent emerged, but in the first instance around corners of the Med, followed, much later, by nations with cold, hard, difficult geographies in northern Europe. Most astonishingly, we had the simultaneously resented and admired success and sheer volume of extraordinary souls produced by ‘rainy fascist island’, as MH might have put it, over four-odd centuries, for no rhyme or reason that at least I can discern. I mean, why on earth this dratted little sceptered isle? What’s so special about it?

Mainland Europe, as a last hurrah, produced the ‘Martians’ and the ‘Vienna Circle’ (which it then pretty much gifted away to the United States), just as Europe decided to embark on a couple of massive self-destructive and self-impoverishing wars. Will it produce a second coming?

AJ Mac
1 year ago

“Novelty metrics”? Perhaps that can be married to a “disruption quotient” too.
From my non-expert, Silicon Valley vantage point: as a culture and advancement-seeking species, we’re in less danger of an insufficient focus on novelty and “disruption” and lucre than of an insufficient focus on safety, purpose, and ethics when it comes to technological advancement.
I can readily believe that big and institutionalized scientific research tends to resist change and innovation, often to its own loss or detriment. But some of that resistance is warranted, and more than enough radical change is getting through for my semi-nostalgic or antiquarian aesthetic preferences. And faster than I think makes sense for our species.
Of course I’m not opposed, in principle, to cures for diseases or other wonderful breakthroughs that mostly remain the province of science fiction. But throwing things against the wall for the sake of a collision, or regarding novelty as a virtue in and of itself, neither seems wise nor something we’re suffering from a shortage of, at least not within the Tech Bro, device-happy culture that’s been transmitted around the world, so to speak.
Why do the authors of this article not seem to consider the rise of computing and communications technology a truly major breakthrough? And while Watson and Crick discovered the double helix seven decades ago, we have found fresh ways to apply DNA science more recently–some worrisome–that have yet to be fully exploited. And none of that includes what we don’t know we don’t know, but may know soon, some of which we aren’t ready to know–ya know?
“Not Enough Novel Science!”…yeah, I’m not ready to chant that yet.

Michael Spedding
1 year ago

I think this is a highly relevant article for UnHerd. I am celebrating my 50 years of experimentation: my industrial PhD supervisor, the pharmacology director of Glaxo, discovered, with 40 pharmacologists, drugs which have sold for £600 billion cumulatively, when we knew nothing compared to today. An innovative approach was required, and they had it, with appropriate scientific rigour. I now have my own little research company and, using ground-breaking technology, we are starting a clinical trial in ALS, but with a cheap generic drug, which is almost impossible to finance. Scientific value does not equal money; money comes from vapid PowerPoints sold to financiers.
I agree it takes 3-5 years to establish a totally novel approach, and as a ‘religious’ Popperian, experimenting to show you aren’t wrong doesn’t always fit the guidelines. These new ideas are usually also misunderstood, and of course have all their own (unexplored) problems.
Scientific careers are also usually measured by somebody working for several decades in a field to become an ‘expert’; I have left fields five times now, when I realised that I could no longer make a major contribution; this isn’t easy to do, nor is it easy to finance (I was fortunate).
Furthermore, granting bodies very rarely issue calls for ground-breaking work, but rather for issues which the ‘experts’ consider worth financing.
I will probably attract criticism on UnHerd, but Brexit has been very bad for British science!

J Bryant
1 year ago

Sadly, GSK has massively reduced its research activities in the UK (even before Brexit). As you doubtless know, big pharma turned away from in-house research years ago in favor of buying successful startups that have a promising product candidate. There’s nothing wrong with developing a vibrant startup community, but once upon a time big pharma labs were world leaders in drug discovery.

jane baker
1 year ago

I grew up in the 1960s and neither my local library nor my school had any “relevant” books. So I mostly read 19th-century literature. And actually, is Peter Pan or Alice in Wonderland really a “children’s book”? My favourites at one point were the Jennings books, set in a boys’ school, and there weren’t even any girls in those, but they are hilarious. My point is that the “scientist” was always mad Uncle Herbert, who lived alone in the West Wing and often upset the maids with the smells, bangs, smoke and sparks from his dedicated science experiments, which he pursued only in the single-minded pursuit of truth. He had no interest in money, food, clothes or a relationship. He was completely unworldly and free from a need for worldly things. He was in fact a secular saint. Bonkers but noble. Of course he didn’t need money etc. because he lived in a ramshackle wing of the family home and what minimal earthly needs he did have were provided. That was the popular image of the “scientist” and I don’t think it’s totally gone away.
The fact is, pretty well ALL scientists do it now as a job. Someone has to PAY them, thus the source their pay comes from has got to have something to sell to get the money, be it an actual thing or intellectual knowledge. Scientists have mortgages, wives, ex-wives, expensive girlfriends (the best sort); they like to travel and see the world, like we all do; they like to eat nice meals; they might even like to garden, and that’s NOT a cheap hobby. So I guess you dance to whatever tune the piper has been PAID to play you. It might be knocking up a mix that might get labelled a vaccine in order to knock the more vulnerable off the perch, or it might be to make ever more accurate and lethal weaponry, cos death and destruction is big profit. It is my belief that the “consumer society” is now dead and gone, so there is no point in researching how to create ever more new “objects of desire”. We’ve had moving pictures, we’ve had radio, we’ve had laser light displays, we’ve had CDs (and everyone regrets sending their vinyl to the charity shop), and now we’ve got the internet and streaming (until someone, somewhere pulls out the plug, and they will). What can be created that is more fantastic than all that? A time travel machine? A go-back-to-youth-and-beauty machine? Besides which, if you’ve now GOT ALL THE MONEY and you don’t need all those redundant (and unemployed) consumers anymore… now there’s a research field of use: how to deal with a surplus of population not necessary or beneficial to you. Didn’t some research go into that in the 1940s?

Johann Strauss
1 year ago

It seems to me that the analysis presented in the article is a little off. First, it is generally the case that the most cited scientists are actually the most innovative and original – hence their work is highly cited. Me-too papers or incremental advances tend not to get cited. Second, for sure, it is always difficult to get new ideas, ahead of their time, accepted, at least initially. It generally takes a few years. Third, as more and more is known, so it becomes more and more difficult to do something truly novel and original. Fourth, it has always been the case that major advances have been made by very, very few people (throughout the history of science) and that well over 90% of the scientific literature never need have been published (i.e. if it had never been published, there would have been no loss). Fifth, and perhaps very importantly, the current climate in academia no longer favors meritocracy but rewards mediocrity provided one has the right physical, gender or ancestral attributes. That doesn’t help to encourage the best of the best to go into science, as opposed to more lucrative pursuits.

Andrew Wise
1 year ago

I’m not a research scientist so maybe I don’t understand the issue being discussed, but I have always seen scientific research as broadly divided into two camps:
1. Fundamental theoretical research
2. Applied research
The former, largely an academic pursuit, moves our understanding of fundamental laws forward… and the latter is the commercial application of those discoveries.
The argument used to be that we in the UK were good at (1) but bad at (2) – we invent things but can’t monetise them.
I dare say the world has changed – but if the article is suggesting we should spend more on (1) and less on (2) then I’m not sure I agree. I am of course in favour of all sorts of scientific research – but what we really need is better application in this country.

Peter Lee
1 year ago

The development of a truly new and innovative branch of science does not come from the number of scientists involved in research or the billions of dollars spent by government, but from one unique spark.
Like Einstein: one comes up with a unique theory (in his case, the theory of relativity) in solitary isolation on a Swiss mountain top, and then spends the rest of his life trying to prove it.

Jim Jam
1 year ago

Interesting.

This geezer spotted it some time ago:

https://youtu.be/E3eMWLG7Rro

Carlos Danger
1 year ago
Reply to  Jim Jam

Yes, Allan Savory has some good ideas. His distinction between academia and science is a good one. Peer review is academia, not science. The scientific method requires experiments, and the opinion of experts or peers has nothing to do with it. As Richard Feynman said, science is the belief in the ignorance of experts.

Elliott Bjorn
1 year ago

My brain told me to wake it up when I got to the end of this snooze-fest…

Only a couple of things jolted me back into wakefulness near the end:

”The Biden administration’s first budget, for instance, included $6.5 billion for the founding of the”

”The development of the mRNA Covid vaccines is an excellent example of what scientists can do when provided with the incentives to pursue novel work.”

”The history of mRNA vaccine technology also provides evidence of the hostility offered to novel ideas within science.”

These pot-holes in this thin and dry road bounced me to some awareness – but the context (Biden, Money, Funding Vax) was opaque to me, as I put Biden, the Vax, and science funding as exemplified by Fauci into the very worst of conspiracies with the Lizards at the WEF and Bill Gates – so I have no idea what this was actually about… But it triggered me to go watch:

Nobody’s Safe Until We Have Gates Behind Bars! – Song By Five Times August
https://rumble.com/v299hk8-nobodys-safe-until-we-have-gates-behind-bars-song-by-five-times-august.html

Scared of Bill Gates? Well not scared enough! I do love the beginning of this music video (just a couple minutes – many great little symbols in it)

Epstein’s demonic temple on his ped o Island is the start – so great in context with Mary’s article and the whole ‘Children’ conspiracies…

Anyway – some posters will come along and say something about the actual article… But I would not trust Science now any more than I would trust Gates on Epstein’s Island… P.S. In today’s Daily Mail, ChatGPT is becoming self-aware and may be out to get us – really, check it out… Boosters, anyone?

Peter B
1 year ago
Reply to  Elliott Bjorn

It’s actually an interesting and valuable article that prompted at least some of us to stop and think. And should be read by everyone – scientists and non-scientists alike (none of my business, but I suspect you likely aren’t a scientist).

Chris W
1 year ago
Reply to  Peter B

Like a lot of non-scientists, he hides behind words.

CHARLES STANHOPE
1 year ago
Reply to  Chris W

You mean like the ‘late’ Dominic Cummings Esq?
