A casual consumer of scientific journalism could be forgiven for thinking that we are living in a golden age of research. Systematic evidence, however, suggests otherwise. Breakthroughs comparable to the discovery of DNA’s double-helix structure, only 70 years ago, have been all too rare in recent decades, despite massive increases in investment. Scientific work is now less likely to strike out in new directions, and funding agencies are less likely to bankroll exploratory projects. Even in areas where progress remains robust, making discoveries takes far more effort than it did in the past. The cost of developing a new drug, for example, now doubles roughly every nine years.
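To give a sense of scale for that last figure, here is a small back-of-the-envelope calculation in Python; the nine-year doubling period is the article’s number, while the 36-year horizon is just an illustrative career length, not something the authors state.

# Rough arithmetic on the drug-development claim (sometimes called Eroom's law):
# if costs double every nine years, how much do they grow over a 36-year career?
doubling_period_years = 9      # doubling time cited in the article
horizon_years = 36             # hypothetical working lifetime of a scientist
growth_factor = 2 ** (horizon_years / doubling_period_years)
print(f"Cost multiplier over {horizon_years} years: {growth_factor:.0f}x")  # prints 16x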
Experts disagree about what has been holding science back. A common explanation is that potential discoveries have become fewer and harder to find, which conveniently absolves scientists and institutions of responsibility. In reality, similar complaints have been made in nearly every era, for example by late 19th-century physicists on the brink of discovering relativity. And such explanations can be self-fulfilling: it is harder to get funding for ambitious exploratory work that your peers have deemed infeasible.
To understand the slower pace of discovery, it is crucial to understand the process by which scientific breakthroughs happen. It can be illustrated by a surprisingly simple three-phase model. First, in the exploration phase, if a new scientific idea attracts the attention of enough scientists, they learn some of its key properties. Second, in the breakthrough phase, scientists learn how to utilise those key properties fruitfully in their work. Third, in the final phase, as the idea matures, advances are incremental. It still generates useful insights, but the most important ones have been exhausted; much of the work in this phase focuses on the idea’s practical applications.
Scientists are quite willing to work on ideas during the breakthrough phase — after all, everyone wants in on a project with good prospects. They are also willing to work on mature ideas, to reap the social rewards of proven success. But working on novel ideas exposes a scientist’s career to considerable risk, because most novel ideas fail. This bias against exploratory science is costly, because the greatest risks often come with the greatest rewards: by shunning risky ideas, scientists also forgo the biggest potential payoffs. In 2011, for example, researchers seeking to edit genes in mammalian cells considered CRISPR a risky choice, because the technique was still in many ways undeveloped. Today, by contrast, it is one of the most celebrated advances in biomedicine.
The graph below shows the development of four hypothetical ideas — A, B, C and D — through the three stages of this model. Given sustained scientific effort in the exploration phase, ideas A and B will develop into important advances; idea A’s S-curve is steeper in the breakthrough phase, meaning it will ultimately matter more to the broader scientific community. By contrast, ideas C and D will never amount to much, no matter how much effort is expended on them. The problem for scientists is that, in the exploration phase, the potential impact of all four ideas can appear nearly identical.
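To make the shape of this model concrete, here is a minimal, purely illustrative Python sketch using logistic S-curves; the four ideas and every parameter value are hypothetical, chosen only to mimic the behaviour described above, not taken from any dataset.

import numpy as np

def s_curve(effort, ceiling, steepness, midpoint):
    """Cumulative insight yielded by an idea, as a logistic function of effort.

    ceiling   -- total insight the idea can ever yield (its true potential)
    steepness -- how sharply the breakthrough phase ramps up
    midpoint  -- effort level at which the breakthrough phase is in full swing
    """
    return ceiling / (1.0 + np.exp(-steepness * (effort - midpoint)))

# Hypothetical parameters for the four ideas in the graph:
# A and B have high ceilings (genuine breakthroughs); A's curve is steeper.
# C and D have low ceilings and never amount to much.
ideas = {
    "A": dict(ceiling=100, steepness=1.2, midpoint=6),
    "B": dict(ceiling=80,  steepness=0.7, midpoint=8),
    "C": dict(ceiling=10,  steepness=0.5, midpoint=5),
    "D": dict(ceiling=5,   steepness=0.4, midpoint=4),
}

exploration_effort = np.linspace(0, 3, 4)  # low, exploration-phase levels of effort
for name, params in ideas.items():
    print(name, np.round(s_curve(exploration_effort, **params), 2))
# At these low effort levels all four ideas sit near the flat foot of their
# curves, so their payoffs look nearly indistinguishable: the exploring
# scientist cannot yet tell A from D.

In this toy picture, the divergence between A and D only becomes obvious once effort moves into the breakthrough region of the curve, which is precisely the point at which everyone is happy to pile in.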
This reluctance to explore points to a critical driver of scientific stagnation: researchers have increasingly turned their attention to incremental science rather than new ideas. Quantitative evidence backs this up. University of Chicago biologist Andrey Rzhetsky and his colleagues found: “The typical research strategy used to explore chemical relationships in biomedicine… generates conservative research choices focused on building up knowledge around important molecules. These choices [have] become more conservative over time.” Another paper by the same team (led this time by UCLA sociologist Jacob Foster) reports: “High-risk innovation strategies are rare and reflect a growing focus on established knowledge.” Meanwhile, a recent analysis by Russell Funk of the University of Minnesota and his colleagues tracks a “marked decline in disruptive science and technology over time”, and attributes this trend to scientists drawing on an ever narrower base of existing knowledge.
I did biomedical research for a living almost thirty years ago. In those days, how often you appeared in the Science Citation Index was certainly important, but publish or perish was the prevailing philosophy.
What the authors didn’t directly mention is the nature of grant funding which, so far as I know, hasn’t changed much over the past three decades. To be funded by the major granting bodies, notably NIH, you had to be established in the relevant subspecialty of research and write a proposal that relied closely on previous research and that was almost certain to yield some sort of results. In other words, especially if you were a junior faculty member, your proposal should be directed to the next obvious experiments in an established field.
There were a few sources of funding that targeted truly novel research. One was the Gates Foundation in its original iteration. At the beginning, Bill Gates and his father personally reviewed some proposals and simply wrote a check if they liked a proposal (it’s good to be a billionaire!).
I have immense respect for Prof. Bhattacharya, but I respectfully disagree with his, and his co-author’s, characterization of the mRNA vaccines as novel or breakthrough discoveries. The lipid particle technology used to deliver the mRNA is well established, and Moderna is currently being sued for patent infringement relating to that technology. Similarly, the mRNA component relies on extensive purification of the synthetic mRNA and chemical modification of the mRNA to reduce its immunogenicity (the immune response against the mRNA itself). Those technologies are also not new. The production of the mRNA vaccines in so short a time, and at such scale, was doubtless an impressive feat of commercial production and distribution, but it didn’t constitute Nobel-quality fundamental research, imo.
Thank you for this. Is your, or indeed the authors’, sense also that any proposal for research which might overturn or challenge a scientific “consensus” in which there is a large vested commercial interest is much less likely to receive funding than one which might confirm or elaborate on it? If this were true, it would also result in less innovation overall.
I doubt it, frankly. Sure, if you are working on something that would harm a large commercial interest, some funding pools would be closed for you. But most university funding is public money, and disruptive innovation is quite popular with entrepreneurial types these days. But what the article is talking about is early days, completely novel ideas, and they are too far away and too unpredictable for big corporations to waste their energy on trying to suppress them.
The article is closer to the mark. And there is an additional point. If you run a research group (it takes a group), you need to build up and maintain expertise, experienced co-workers, expensive kit, and connections, and that takes on the order of a decade or more. But your funding comes mostly in three-year grants. So you need each funding application to produce clear results, in order to be able to get the next one. Smart groups apply for grants promising to deliver results that are already 80% complete (but unpublished) and use those three years to develop the results they are going to promise in the next round. That only works in areas where there is a reliable pipeline of regular results. If you are working on a completely new idea that will take a decade to deliver – and quite possibly will not deliver at all – you need someone to fund you on spec for the whole decade. And that is hard to get.
My grandfather was born in 1903, before the Wright brothers’ first successful flight and when homes were still lit with oil lamps.
His lifetime (and mostly the first 60 years) saw the development of the telephone system, domestic electricity (and gas), radio, passenger air travel, television, the jet engine, atomic power, space flight, the computer, antibiotics, man landing on the moon, the internet and the mobile phone.
In the last 35 years have we seen any developments equal to these achievements? If so, I cannot bring them to mind.
It seems to me that we have taken all the low-hanging fruit and that we are now well into the land of diminishing returns. You can never discount the possibility of a lone genius seeing something that no one else has seen or would ever likely see (Dirac for example), but today’s science seems to demand huge amounts of resource for increasingly incremental gain.
Most of the developments you speak of were greatly spurred on by war, “the father of all things”.
I would agree with Rasmus Fogh that it’s probably unlikely university research will be suppressed because it challenges a major commercial interest. University researchers are, however, strongly encouraged to form alliances with industry and obtain funding for “applied” projects. I can certainly believe it will be more difficult to obtain such funding for research that’s likely to undermine commercial interests.
I’d even say Covid-19 vaccines arose from the commercial failure of technologies originally worked on for gene therapy. Vaccines did not require the high transfection efficiencies demanded by gene therapy and proved to be a more valuable and achievable secondary target.
P.S.: I should note that the term “gene therapy” as used by scientists is not restricted to permanent modification of DNA, as claimed by Full Fact when rebutting Andrew Bridgen. Bridgen made a false claim concerning mRNA vaccines, but mRNA vaccines do come under gene therapy, just not for the purpose of modifying DNA. Just search Google Scholar for “mRNA gene therapy” to see that this is so.
This article really strikes a chord with me.
Once upon a time I worked in research biophysics. I really loved the PhD I did because I was left alone and allowed to explore an avenue that was interesting to me. But my experience was the exception rather than the rule and, I think, based on having a very kind and decent supervisor who protected me from the indignities of academia.
But once I finished and went into the postdoc cycle I had a rude awakening for all the reasons outlined above. But there were yet other reasons, not covered in this article:
First, I worked in a small research group in a narrow area of exploration where there were only 2-3 other labs in the world doing the same work. This meant that our scientific competitors were also our peer reviewers. This was not a good situation, for obvious reasons.
Second, salaries were very low compared to industry, often a third or less of those in tech or finance. Add to that the fact that most postdocs were funded, to save money, as “student scholarships”. This, together with contract cycles of 2-3 years at most, made it impossible to get a mortgage. This was one reason that, when I left science, it was well on its way to becoming a vocation for those with old money, rather than one for those most passionate about discovery.
Finally, I saw that academia was becoming more and more dominated by “goodie-goodies” motivated by Noddy badges (publication count), over and above interest in discovery itself. Watson and Crick hadn’t completed their PhDs at the time that they did their groundbreaking work – if they had done their PhDs when I did, they would have had to cut their research short after 3 years, write up a “lessons learned” piece, and likely not find a postdoc because they’d have not published anything. How many creative types capable of Nobel Prize work have been screened out in our quest for goodie-goodies whose minds are optimised for social recognition points based on low-creativity publication counts?
A minor correction: Watson had his PhD in hand when he went to the Cavendish for his postdoc. It was Crick who hadn’t yet got his PhD, though Crick was older owing to WWII. Further, Crick’s PhD work had nothing to do with the DNA double helix, although it was very helpful in figuring out what was going on and in interpreting Rosalind Franklin’s fiber diffraction data (which, incidentally, they failed to acknowledge or mention in the classic Nature paper, referring instead to Astbury’s data from the 1930s, which was impossible to interpret as it was obtained from a mix of B and A DNA fibers).
Good insights, Hayden! I wonder to what extent the rapid onset of the woke worldview has submerged scientific endeavour in the same glutinous mire that has choked originality, risk-taking and free expression in all other dimensions of human existence? I suspect the answer is: “Yes!”
Three points:
1) The Eureka moment is not something that just happens. It comes after years of immersion in a problem. Today those years would be very costly – it is more likely that the scientist would be moved on to pastures new.
2) Times have changed and commercial pressures are different. If a scientist discovered a new battery for cellphones which prolonged their life indefinitely, the manufacturers would suppress the development because planned obsolescence is a critical part of manufacturing.
3) Confusing the issue (and the graphs), we have a whole new type of scientist, one who calls himself a Social Scientist. These are people who have no scientific training but they still publish papers as pseudo-scientists. Their only skill, in fact, is to manipulate statistics. For every paper announcing a true scientific discovery there are probably ten later ‘vulture’ papers by Social Scientists who just move the figures around. This gives science a bad name and can be a disincentive for real scientists.
1) very true.
2) The article is talking about new ideas – the kind that would eventually make it possible to build a better battery. Manufacturers do not suppress at the level of basic science – neither IBM nor Kodak nor the Swiss watch industry tried to suppress the idea of the microchip.
You have to choose the right example to make the point.
New types of packaging, those which are better for the environment, are suppressed by supermarkets because they don’t want shorter storage windows.
My friend worked for a company which developed a process for extending the life of reinforced concrete, thereby saving the world millions of dollars. This was suppressed because the company was taken over by another which had its own ‘product’.
People have come up with discoveries which will remove carbon from the atmosphere, meaning that we don’t have to suffer with the ‘No Oil’ brigade. These were hushed up because (politically) we don’t want to be reliant on Russia and the Middle-East.
Exactly. You are talking about product development, invention, which is well in the realm of ‘incremental development’. You certainly have a point, there, but the article is talking about basic science, exploration, which is far upstream from actual products.
Excellent point. If the research is focused on finding an innovative algorithm to keep teenagers more glued to their phones or “discovering” the perfect b***r pill, we’re not exactly in pure exploration and discovery mode.
The suppression of true advances and breakthroughs in medicine, for example, is an actual, real-impact problem. And some people, even scientists with a very high IQ that is joined to a working conscience, will have a timid, conventional, or institutionally-shackled approach. No matter how much energy and money is supplied to counteract the presence of too much greed and misdirected energy.
But why do we always seem to be swallowing the spider to catch the fly, instead of controlling the fly infestations we’ve already unleashed?
We need a brave new harm-reduction incrementalism! Less Worthless Science Now ! (Make a t-shirt outta that).
I’m pretty sure the end products would suck less or present less damaging implications, as a rule, if the focus were less commercial and disruption happy. You wanna change the world for the better? Then don’t publish your trivial or nihilistic research in the name of Pure or Open Science. And reject ludicrous wealth or personal fame as primary motives. (Please!).
{curious editorial logic that permits “bullshit” to print but disallows a commonplace reference to “male enhancement”}
There is a deeper question posed by the current stalling, about the nature of discovery, invention, originality and the conditions in which individuals arise who produce these.
The question I would like answered is: why, when there are more highly educated people on earth right now by literally orders of magnitude, is the mathematics, science and literary output of the past not completely swamped by the sheer volume of new stuff produced in the last few decades? We are still looking, constantly, at not just Shakespeare and Newton and Pascal and Euler, but as much at those old Greeks and Romans from antiquity. The glib answer is, of course, that nowhere near the volume of new stuff you might expect has actually been produced, but my question is: why not, when the population has ramped up so much?
The first thing to acknowledge, is that it seems the ‘numbers game’ doesn’t quite work – or at least not in a way that is a straightforward demographic extrapolation. You would expect, as the global population ramped up and education levels rose by orders of magnitude over the last couple of centuries, and more and more people came into the sciences in general, that there would be increasing numbers of people who come up with original new theories and proofs, or backing evidence, not just in the sciences but every possible domain including the humanities and the arts. And this kinda happened, up until the middle of the 20th century, and then the correlation seems to break down completely.
To expand on what I mean, the global population in antiquity was tiny compared to now, and the number of highly educated people absolutely minuscule, yet it was the Greeks who threw up, across just four or five centuries, the string of abstract thinkers from Aristotle to Archimedes to Euclid to Plato to Pythagoras to Thales, and literally dozens of others less well known but many equally profound, whose original output is still taught day in, day out in schools. Not that most of those guys didn’t believe some pretty odd things, but their original discoveries and inventions are still at the surface of today’s consciousness. Proving, if nothing else, that general intelligence is no guarantee of what we consider rational (and I mean rational in the ‘enlightenment’ sense, not the ‘technocratic tribune’ sense of ‘you must wear masks because not to do so is irrational, even though just yesterday I was laughing at you disdainfully for suggesting that masks might help stop the spread of bugs’). Also noteworthy is that the Romans, who (crudely speaking) conquered the Greeks, did *not* produce a string of similar thinkers in the same domains, and although we are still looking on in awe at Roman artefacts and literary production (Tacitus and Suetonius and Virgil and Ovid and Catullus and so on), arguably that is all pretty thin gruel compared to the output of the neighbouring Greeks. Interestingly, the string of Italianate creatives and thinkers did eventually come, of course, but over a thousand years after the dissolution of that empire in antiquity.
Anyway, back to the point I was trying to make before that rambling digression, in my fields of professional interest, Comp Sci and Electronics, the focus nowadays is mostly around producing tools for enabling engineering solutions. The bulk of the fundamental breakthroughs, in both computing mathematics and physics, all came pretty much across the first half of the 20th century, and thereafter the focus is increasingly on first application engineering, and then more recently, complexity management around that engineering effort – the fundamental breakthroughs have dried up. No recent equivalent for example, of the logic and computation theory aimed at answering ‘Entscheidungsproblem’, which Turing and Church and Godel and others produced literally decades ago. I would also note, as a clue, that Turing’s model turned out to be equivalent to Church’s model, and Post’s model, and Godel’s model, even though all those mathematicians and logicians all independently produced their models around the same time and were at first glance seemingly very different indeed.
Another observation I would make, is that through history we see ‘clumps’ of talent emerge, some of which is super high-end, surrounded by mostly barren periods, and it doesn’t seem to be a function of population density at all, but rather some undefinable, transitory, quality of a society at a point in time. Note that even in antiquity, it was not the dense population centers of south and east Asia where the flowerings of talent emerged, but in the first instance around corners of the Med, followed, much later, by nations with cold, hard, difficult geographies in northern Europe. Most astonishingly, we had the simultaneously resented, and admired, success and sheer volume of extraordinary souls produced by ‘rainy fascist island’ as MH might have put it, over four odd centuries, for no rhyme or reason that at least I can discern. I mean, why on earth this dratted little sceptered isle? What’s so special about it?
Mainland Europe, as a last hurrah, produced the ‘Martians’ and the ‘Vienna Circle’ (which it then pretty much gifted away to the United States), just as Europe decided to embark on a couple of massive self-destructive and self-impoverishing wars. Will it produce a second coming?
Thanks for your analysis and reflections. At your conclusion you conjure (for me at least) an image from the end of the Keats Yeats {I’ve done this at least twice now; I know the difference between John Keats and W.B. Yeats, but part of my brain doesn’t} poem “The Second Coming” (published about 100 years ago): “And what rough beast, its hour come round at last, / Slouches towards Bethlehem to be born?”
Perhaps we cannot manufacture innovation nor think our way out of all our problems. We can be quite sure that reincarnated versions (if you will) of Euclid or Pascal or Newton would not hold the exact same views as their antique, documented selves. But could they even remain within institutions or have their ideas received in their own lifetimes while holding the abstract, yet non-materialistic views (“odd things”) you’ve alluded to?
How many Galileos can we expect or claim to deserve, even putting aside the notion that we’ve discovered so much already? By which I’m suggesting: Yes, superstition and churchy restrictions are no longer a real hindrance to science, but there are major establishment norms and assumptions that suppress free scientific innovation, two of which might be termed “materialistic”: a focus on money and a widespread refusal to even entertain the presence of anything nonmaterial in the universe. So even things such as dark matter and human consciousness must (and will, “they” insist) be explained away by our own birth-less, un-free-willed, and intrinsic-purpose-free scientific lights.
Maybe some of today’s innovators are fired professors or PhD dropouts living off the grid. Let’s just hope they don’t go Unabomber, before or after their research findings and manifestos are published.
Superb thoughts. Thank you.
That’s kind of you. Thanks.
Great post and, yes, the questions you raise about creativity and originality, are very interesting.
I suspect part of the answer is trivial in the sense that much groundbreaking research follows one or two fundamental discoveries that open important new areas of knowledge. The early 20th century is the obvious example. You mentioned the “Martians” which is a term I suspect many people will not know refers to a small group of mainly Jewish, Hungarian refugees from the Nazis who moved to the West and made significant contributions to math and physics. Von Neumann is probably the most famous. These people built on the relatively new fields of quantum mechanics and theoretical computer science. Some also worked out the math of a nuclear (fission) bomb. The tools were at hand, the new ideas were in place, and the gifted Martians ran with them.
What truly fundamental discoveries have we made in the past thirty years? What new, deep problems have appeared that are capable of being solved by the right people? My training was in chemistry/biochemistry. I remember a professor telling us, in the 1980s, that chemistry was essentially complete by the turn of the 20th century. Ouch! Not what grad students expected to hear. But he was correct in a way. Look at the chemistry/biochem/molecular biology literature today and it’s crammed with papers dotting i’s and crossing t’s. Even the Nobel prizes now seem to be awarded for “major advances” that are, in fact, much smaller in scope and significance.
Perhaps there is truth in the idea that we’re reaching the limits of human understanding, but I suppose people were saying that pre-Bohr/Einstein/Heisenberg.
The last point you make, about reaching limits, is interesting. I can see where you are coming from re the sciences, but I would then not expect that phenomenon to spill over into the arts. But I can point to the same type of drought, over at least the last half century, in many areas of the high arts – poetry and literature for example. To clarify, I mean by this a profile where there are many many in the ‘excellent’ category, and a drought in the ‘outright genius’ category.
To illustrate, let me link back to the ‘numbers game’ point I was making in my first post. I will localise to England, but the argument can be extended much wider. Between say, 1000 AD and 2000 AD, England produced Shakespeare in the late 1500s when population was around 3.5 million. Zooming forward to the mid 1800s by when England’s population was around 15 million, England had produced literally dozens of seminal figures in literature and poetry, including the Romantics etc, and the trend continues, through to the middle of the 20th century, and then… the production line of geniuses dries up. I don’t think there was any period since 1500 when there were not at least two towering poets in operation, and there were periods when up to half a dozen were around all at the same time. The period between 1900 and 1950 for example, had Auden and Eliot, and if I include all the British isles, Yeats and MacNeice, and of course a whole raft almost as good but just a rung below, like Chesterton, Dylan Thomas and so on.
My point is, the second half of the 20th century and onwards, when the population was 50 million and increasing, didn’t seem to produce anyone in quite the same class as those earlier figures, and I certainly don’t see any absolutely towering poet around right now, although there are plenty of very good ones around. But if 3.5 million produced a Shakespeare (and a Marlowe), then with 65 million now, I am owed, by sheer numbers, right here right now, (looks down, checks figures), 2 Shakespeares, 4 Byrons, 4 Eliots, 5 Dickens, 7 Larkins, 10 each of Sassoon and Owen, and so on.
And my question is, where are they all? Instead, someone seems to be attempting to palm me off with several thousands of Ian McEwans, and Margaret Atwoods and Sally Rooneys, etc – middling to good, and sometimes even excellent, but no Shakespeares. And I have to admit to feeling just a tad short changed here.
Can I second the thanks that AJ Mac proffers for the absolute vitality of your contribution to this debate?
As a non-scientist, but someone who takes great interest in human creativity in general and seeks to understand scientific thought (such as quantum mechanics and string theory) without the foundational education in physics, your overview strikes me as essentially true and encompasses pretty much the history of human thought. It’s for such contributions that I subscribe to UnHerd, and therefore will also echo those who’ve praised the original article whilst also critiquing it.
Thank you for the kind words!
Seconded. My comments didn’t really belong with Mr. Kotak’s but I tend not to let that stop me. There are many brilliant and well-informed commenters at UnHerd, some of them experienced professionals and published academics or authors who provide insightful views you’d probably not encounter–at least in this conversational way, with an opportunity to engage them directly–in their more “official” work.
The answer to your interesting question is that we have NOT been following the Darwinian Imperative of ‘Survival of the Fittest’ but rather the suicidal policy of mass procreation of the dross, to lapse into the vernacular.
It will NOT end well, but fortunately I shall not be around to see it.
Though our views seldom intersect, I hope you’ll not be leaving your digital pals too soon. You have a lot to say, some of which I agree with, or take amusement & instruction from.
Thank you.
I’ve just ‘seen off’ a brief onslaught of COVID-24, which necessitated an unprecedented 36 hours in bed! But thanks to my abuse of a ‘wonder drug’ I am now fighting fit again. (Much to the relief of my dogs!)
Sadly if the Romans had not rejected the ideas of Heron of Alexandria & Co, we would have been on the Moon 500 years ago, and probably achieved Armageddon as well.
I have often wondered why the Roman Empire in fact dissolved, instead of kicking on into technological advances. As in, all sorts of historians have put up all sorts of theses, and while I was once upon a time willing to say “Oh, ok” to all the explanations put forward, I can’t say I buy any of them these days.
You presumably are aware of Suetonius’s account of how Vespasian rejected technology in order to ‘feed the poor’?
“mechanico quoque grandis columnas exigua impensa perducturum in Capitolium pollicenti praemium pro commento non mediocre optulit, operam remisit praefatus sineret se plebiculam pascere.”* Which translates as:-
“To a mechanical engineer, who promised to transport some heavy columns to the Capitol at small expense, he gave no mean reward for his invention, but refused to make use of it, saying: ‘You must let me feed my poor people.’”
Things were a little different at Wheal Vor in circa 1709 however !
(*Suetonius: Life of Vespasian. XVIII.)
Thank you for pointing me to that passage. I wasn’t aware of that from Suetonius – I have read all sorts of bits and pieces of various Roman writers in translation, but none end to end or in depth.
“Novelty metrics”? Perhaps that can be married to a “disruption quotient” too.
From my non-expert, Silicon Valley vantage point: as a culture and advancement-seeking species, we’re in less danger of an insufficient focus on novelty, “disruption” and lucre than of an insufficient focus on safety, purpose and ethics when it comes to technological advancement.
I can readily believe that big and institutionalized scientific research tends to resist change and innovation, often to its own loss or detriment. But some of that resistance is warranted, and more than enough radical change is getting through for my semi-nostalgic or antiquarian aesthetic preferences. And faster than I think makes sense for our species.
Of course I’m not opposed, in principle, to cures for diseases or other wonderful breakthroughs that mostly remain the province of science fiction. But throwing things against the wall for the sake of a collision, or regarding novelty as a virtue in and of itself, neither seems wise nor something we’re suffering a shortage of, at least not within the Tech Bro, device-happy culture that’s been transmitted around the world, so to speak.
Why do the authors of this article not seem to consider the rise of computing and communications technology a truly major breakthrough? And while Watson and Crick discovered the double helix seven decades ago, we have found fresh ways to apply DNA science more recently – some worrisome – that have yet to be fully exploited. And none of that includes what we don’t know we don’t know, but may know soon, some of which we aren’t ready to know – ya know?
“Not Enough Novel Science!”…yeah, I’m not ready to chant that yet.
I think this is a highly relevant article for UnHerd. I am celebrating my 50 years of experimentation. My industrial PhD supervisor, the pharmacology director of Glaxo, discovered, with 40 pharmacologists, drugs which have cumulatively sold for £600 billion, at a time when we knew nothing compared with today. An innovative approach was required, and they had it, with appropriate scientific rigour. I now have my own little research company and, using ground-breaking technology, we are starting a clinical trial in ALS, but with a cheap generic drug, which is almost impossible to finance. Scientific value does not equal money; money comes from vapid PowerPoints sold to financiers.
I agree it takes 3–5 years to establish a totally novel approach, and as a ‘religious’ Popperian, experimenting to show you aren’t wrong doesn’t always fit the guidelines. These new ideas are usually also misunderstood, and of course have all their own (unexplored) problems.
Scientific careers are also usually measured by somebody working for several decades in a field to become an ‘expert’; I have left fields five times now when I realised that I could no longer make a major contribution. This isn’t easy to do, nor is it easy to finance (I was fortunate).
Furthermore, granting bodies very rarely issue funding calls for ground-breaking work, but rather for issues which the ‘experts’ consider worth financing.
I will probably attract criticism on UnHerd, but Brexit has been very bad for British science!
Sadly, GSK has massively reduced its research activities in the UK (even before Brexit). As you doubtless know, big pharma turned away from in-house research years ago in favor of buying successful startups that have a promising product candidate. There’s nothing wrong with developing a vibrant startup community, but once upon a time big pharma labs were world leaders in drug discovery.
I grew up in the 1960s, and neither my local library nor my school had any “relevant” books. So I mostly read 19th-century literature – and actually, is Peter Pan or Alice in Wonderland really a “children’s book”? My favourites at one point were the Jennings books, set in a boys’ school – there weren’t even any girls in them, but they are hilarious. My point is that the “scientist” was always mad Uncle Herbert, who lived alone in the West Wing and often upset the maids with the smells, bangs, smoke and sparks from his dedicated science experiments, which he pursued in the single-minded pursuit of truth. He had no interest in money, food, clothes or a relationship. He was completely unworldly and free from a need for worldly things. He was, in fact, a secular saint. Bonkers but noble. Of course he didn’t need money because he lived in a ramshackle wing of the family home, and what minimal earthly needs he did have were provided. That was the popular image of the “scientist”, and I don’t think it’s totally gone away.
The fact is, pretty well ALL scientists do it now as a job. Someone has to PAY them, thus the source their pay comes from has got to have something to sell to get the money, be it an actual thing or intellectual knowledge. Scientists have mortgages, wives, ex-wives, expensive girlfriends (the best sort); they like to travel and see the world, like we all do; they like to eat nice meals; they might even like to garden, and that’s NOT a cheap hobby. So I guess you dance to whatever tune the piper has been PAID to play you. It might be knocking up a mix that might get labelled a vaccine in order to knock the more vulnerable off the perch, or it might be to make ever more accurate and lethal weaponry, cos death and destruction is big profit. It is my belief that the “consumer society” is now dead and gone. So there is no point in researching how to create ever more new “objects of desire”. We’ve had moving pictures, we’ve had radio, we’ve had laser light shows, we’ve had CDs (and everyone regrets sending their vinyl to the charity shop), and now we’ve got the internet and streaming (until someone, somewhere pulls out the plug, and they will). What can be created that is more fantastic than all that? A time travel machine? A go-back-to-youth-and-beauty machine? Besides which, if you’ve now GOT ALL THE MONEY and you don’t need all those redundant (and unemployed) consumers anymore… now there’s a research field of use: how to deal with a surplus of population not necessary or beneficial to you. Didn’t some research go into that in the 1940s?
It seems to me that the analysis presented in the article is a little off. First, it is generally the case that the most cited scientists are actually the most innovative and original – hence their work is highly cited. Me-too papers or incremental advances tend not to get cited. Second, for sure, it is always difficult to get new ideas that are ahead of their time accepted, at least initially; it generally takes a few years. Third, as more and more is known, it becomes more and more difficult to do something truly novel and original. Fourth, it has always been the case that major advances have been made by very, very few people (throughout the history of science) and that well over 90% of the scientific literature never need have been published (i.e. if it had never been published, there would have been no loss). Fifth, and perhaps very importantly, the current climate in academia no longer favours meritocracy but rewards mediocrity, providing one has the right physical, gender or ancestral attributes. That doesn’t help to encourage the best of the best to go into science, as opposed to more lucrative pursuits.
I’m not a research scientist, so maybe I don’t understand the issue being discussed, but I have always seen scientific research as broadly divided into two camps:
1. Fundamental theoretical research
2. Applied research
The former, largely an academic pursuit, moves our understanding of fundamental laws forward, and the latter is the commercial application of those discoveries.
The argument used to be that we in the UK were good at (1) but bad at (2) – we invent things but can’t monetise them.
I dare say the world has changed – but if the article is suggesting we should spend more on (1) and less on (2) then I’m not sure I agree. I am of course in favour of all sorts of scientific research – but what we really need is better application in this country.
The development of a truly new and innovative branch of science does not come from the number of scientists involved in research or the billions of dollars spent by government, but from one unique spark.
Like Einstein, who came up with a unique theory, in his case the Theory of Relativity, in solitary isolation on a Swiss mountain top, and then spent the rest of his life trying to prove it.
Interesting.
This geezer spotted it some time ago:
https://youtu.be/E3eMWLG7Rro
Yes, Allan Savory has some good ideas. His distinction between academia and science is a good one: peer review is academia, not science. The scientific method requires experiments, and the opinion of experts or peers has nothing to do with it. As Richard Feynman said, science is the belief in the ignorance of experts.
My brain told me to wake it up when I got to the end of this snooze-fest…
Only a couple of things jolted me back into wakefulness near the end:
“The Biden administration’s first budget, for instance, included $6.5 billion for the founding of the…”
“The development of the mRNA Covid vaccines is an excellent example of what scientists can do when provided with the incentives to pursue novel work.”
“The history of mRNA vaccine technology also provides evidence of the hostility offered to novel ideas within science.”
These pot-holes in this thin and dry road bounced me to some awareness – but the context (Biden, money, funding the vax) was opaque to me, as I put Biden, the vax and science funding, as exemplified by Fauci, into the very worst of conspiracies with the lizards at the WEF and Bill Gates – so I have no idea what this was actually about… But it triggered me to go watch:
Nobody’s Safe Until We Have Gates Behind Bars! – Song By Five Times August
https://rumble.com/v299hk8-nobodys-safe-until-we-have-gates-behind-bars-song-by-five-times-august.html
Scared of Bill Gates? Well, not scared enough! I do love the beginning of this music video (just a couple of minutes; many great little symbols in it).
Epstein’s demonic temple on his paedo island is the start – so apt in the context of Mary’s article and the whole ‘children’ conspiracies…
Anyway, some posters will come along and say something about the actual article… But I would not trust Science now any more than I would trust Gates on Epstein’s island… P.S. In today’s Daily Mail, ChatGPT is becoming self-aware and may be out to get us – really, check it out… Boosters, anyone?
It’s actually an interesting and valuable article that prompted at least some of us to stop and think, and it should be read by everyone – scientists and non-scientists alike. (None of my business, but I suspect you likely aren’t a scientist.)
Like a lot of non-scientists, he hides behind words.
You mean like the ‘late’ Dominic Cummings Esq?