A new paper published this week provides evidence that the progress of science has slowed considerably over the last few decades. The research — authored by American academics Michael Park, Erin Leahey and Russell J. Funk — builds on a previous study, ‘Are ideas getting harder to find?’, which claims that “research effort is rising substantially while research productivity is declining sharply”.
The findings of the Nature report go against our expectations of scientific research as a process in which prior knowledge facilitates new discoveries, and where disciplines endlessly branch out into further sub-disciplines. Park, Leahey and Funk write that “relative to earlier eras, recent papers and patents do less to push science and technology in new directions”, and attribute the “decline in disruptiveness to a narrowing in the use of previous knowledge”.
This is to say, a larger volume of material is being produced but scientific research is becoming increasingly specialised, to the point of esotericism and to the detriment of significant advances. What’s more, the Nature paper finds that recent studies are “less likely to connect disparate areas of knowledge”. Using data from 45 million papers and 3.9 million patents, it provides examples of pharmaceuticals and semiconductors as areas of study which are regressing.
The authors note that the decline in disruptive research is not so much due to changes in the quality of published science. Rather, it “may reflect a fundamental shift in the nature of science and technology.”
This shift, naturally, will have an impact beyond the sometimes insular world of scientific study. The researchers claim that the “gap between the year of discovery and the awarding of a Nobel Prize has also increased, suggesting that today’s contributions do not measure up to the past”. The implications of this are damaging to the development of health and security policy, and to economic progress more broadly.
How, then, to combat this stasis, and guarantee more significant breakthroughs in medicine, engineering and climate science? The report suggests that scientists should broaden their scope of reference, to ensure cross-disciplinary collaboration, as well as prioritising quality of research over quantity of output, potentially through taking extended sabbaticals.
Working off the idea that “using more diverse work, less of one’s own work and older work tends to be associated with the production of more disruptive science and technology”, the authors recommend that scientists avoid the pitfalls of contemporary group-think and self-reference, and more productively draw on the discoveries of the past.
I used to do biopharma research for a living. The main conclusions of this article ring true, as do the suggested causes, although the authors sidestepped one major issue.
In academia, grant funding is highly competitive. To secure funding you must be established in the narrow area of research where you’re requesting funding. Your research proposal must be almost guaranteed to yield meaningful results; in essence, you only get funded to answer the next obvious question in a well-characterized area of research. There is also a strong incentive to break results down into “minimum publishable units” to pad your CV. Hence, the scientific literature is bulging with trivial papers.
In the biopharma industry, the major companies have been cutting their own research for decades because their huge, internal bureaucracy stifles innovation. They grow by acquiring smaller companies and those smaller companies de-risk their research programs so far as possible.
Academic and commercial research are now both big businesses, and big business hates risk.
The issue the paper’s authors dance around is censorship. Even before our current age of cancel culture, anyone proposing a radical new scientific theory had better have a stable job, such as academic tenure. Of course, a radical new theory must be rigorously tested, but even if the theory is correct you’re challenging a status quo on which powerful people have built their careers. Science is inherently conservative.
In our modern age of cancel culture, and the progressivism of journals such as Nature where the article we’re currently discussing was published, there’s an increasing list of subjects that can no longer be freely and honestly researched.
Underlying the Nature article is the fascinating question of creativity. What their data purports to show is that, even as the total amount of scientific research increased over the past fifty years or so, the absolute number of truly ground-breaking discoveries remained constant. Is there an inherent limit on originality? Is it possible to foster originality? These questions are beyond the current article but are, for me, highly important.
“Is there an inherent limit on originality?”
It’s quite hard to define originality, and therefore difficult to recognise it. My first thought was that you have to be a little crazy to be one of those people, as if you’re genuinely unaware of established boundaries. But then I was thinking about the work of Hemingway, how his writing was so original, or James Joyce or Picasso. They took off on their own expedition into the unknown, but I wouldn’t call them crazy. But it does seem to me that originality depends on the individual, and today the true individual seems to have gone to ground.
The idea that pure scientists are a bit crazy is false. They are the only ones who are not, and the rest of us who fail to see the world realistically and therefore have to use imagination to replace facts, are the ones who are a bit strange.
I find that such science as used by these “crazies” contains of necessity two conflicting trends or statements. Hamlet should have claimed “To be and not to be, this is the answer!” Then we might begin to get our approach to realism in order, in what can also be a coldly logical (but oxymoronical) way.
I agree. The crazy I was using is very much in the way you phrased it: “ They are the only ones who are not, and the rest of us who fail to see the world realistically and therefore have to use imagination to replace facts”
Thanks for a refreshing perspective.
I’m very pro science, but it will at least for the foreseeable future be a very limited and inadequate way of understanding human societies. An eclipse can be predicted with enormous accuracy, not so a war!
Some scientists have indeed been pretty eccentric, shall we say, by most ordinary standards. They have also had flashes of insight (‘inspiration’) when not consciously thinking of a problem, though I quite agree that a lot of hard work (the ‘perspiration’) needs to be undertaken first.
I have some sympathy with your points, but science isn’t simply a matter of setting out ‘facts’; it is a matter of finding systematic connections between them and thus improving our understanding of the world. That is, I would say, the physical world mainly.
While we can do legitimate social science and psychological research, human beings and societies are vastly more complex ‘emergent phenomena’ than, say, quantum interactions. We are ourselves very aware of our relationships with other human beings, question their benevolence and motives, and are on a very deep evolutionary level tribal in nature. All this, and many other factors, influences any social science study in a constant, iterative way. This in turn makes robust, hard-and-fast ‘scientific’ laws very much more difficult to obtain. Such subjects as economics are littered with a kind of hard-science envy, but have traditionally depended to an almost silly degree on dubious assumptions of ‘rational’ human behaviour.
When it comes to religion, political beliefs etc, the role of science seems even less directly relevant. You can’t really explain wars by a scientific theory, but we do nonetheless need to better understand their causes and how best to bring them to a close.
Your comment puts me in mind of this quote:
Rockefeller’s General Education Board, from a document called “Occasional Letter Number One”:
The way that’s written I’m betting it’s not from the present.
I have started watching a Jordan Peterson podcast – an interview with Dr Richard Lindzen, a formidable atmospheric physicist.
Right at the beginning this is discussed – fewer people are going into science and more into finance, universities are chock-full of administrators, scientists spend huge chunks of their time seeking grants and, very importantly, there is no room for dissent, so there is no funding for challenging prevailing narratives. Well, we know that, don’t we – there is only ‘one science’.
As an aside, Peterson is going after the climate ‘one science’, which is why the establishment is going after him to the degree they are right now.
The problem with Peterson is that he doesn’t understand climate science (and this is a problem for most commentators). Lindzen obviously does, but AFAICT he has nailed his colours to the mast of low sensitivity, which is clearly nonsense. The idea that there is ‘one’ climate science isn’t true… there are lots of disagreements. In fact I’ve just come off a meeting where we were all disagreeing about climate model uncertainty. What ‘lay people’ think is going on in climate science is almost never what is really going on.
He is interviewing people who understand climate science or who come at it from a completely different angle. He also, for example, interviewed a de-transitioner, though he himself has never transitioned or de-transitioned.
As to your assertion that there are lots of positions on climate science – well I’m pleased to hear that but they are not being published in corporate media to any extent.
It would be nice if some of you scientists would speak up and correct the popular misunderstandings. The New York Times, for instance, is full of the assumption that mankind is headed for extinction because of climate change. This is not supported by the science; yet I’ve never seen a comment (except my own) correcting this assumption.
And what is AFAICT??!
“the assumption that mankind is headed for extinction because of climate change. “
Yes, by what date? 50 years, 100, 500, 1000 or just eventually? Let’s get the science on this.
And you do? The one thing Peterson understands is that long run modelling is BS.
As a real expert in forecasting (I have a PhD and 15 peer-reviewed articles, some in top economics journals) who is currently working as a macro forecaster for a large corporation, I know Peterson’s right. The only reason any climate model has shown long-term accuracy is that there are hundreds if not thousands of them, so by chance one will be fairly close to reality for a while. When “experts” claim we need to do X, which will hurt people now (i.e. kill or ruin many poor people’s lives) but will create a long-run gain, that claim is not credible. If we are stupid enough to follow that policy and it doesn’t work and harms people, the proposers will not suffer any penalty.
Look at Paul Ehrlich. His ideas were completely wrong and led to forced sterilization policies and a justification for the one-child policy in China. Is he a laughing stock, or in prison, or has he in any way suffered for the stupidity and maliciousness of the ideas he proposed? Has he recanted? No, he is still a tenured professor and was given a softball interview on the increasingly discredited mainstream media channel.
Have the proponents of the ridiculous Energiewende, which increased CO2 and other forms of air pollution and made Germany politically weaker, suffered any pushback? I doubt it, but I don’t know German politics that well.
Huxley gave a talk on the BBC explaining why we won so many Nobel Prizes.
Rigorous selection at 11 or 13 years of age. O levels taught one to remember facts. Early specialisation at A Levels, with university scholarship exams being first-year degree standard. Being able to go up to university at 17 years of age. Degree by the age of 20. Doctorate at 22. Most innovative work done at the end of the doctorate and first postdoc. Britain has always been short of cash, and Saturday morning found scientists buying their own equipment in Tottenham Court Road. In the USA and on the Continent people did not obtain doctorates until their mid or late 20s, which meant they missed out on their most innovative phase. Most scientists, Newton and Einstein for example, achieve most of their innovation in their early 20s. Newton developed calculus at university and then undertook most of his innovation from 1665 to 1666. I suggest that a small number of rigorously selected and trained people in a few universities is a better use of money than spreading it around. There are about 140 universities in the UK but only about 5 make major innovations. The university entrance/scholarship exams and Scholarship S Levels enabled 17 to 18 year olds to obtain a first-year degree education. This meant Cambridge/Imperial three-year degrees achieved the same standard as a five-year German Diploma or the six-year American B.Sc. and M.Sc. I knew people who obtained doctorates from Imperial in two and a half years when they were only twenty-two years of age. Bill Penney, Rector of IC, was an example.
“Most scientists, Newton and Einstein for example, achieve most of their innovation in their early 20s.”
I would say that was the case in almost all fields. For some reason there’s a lot going on at that age that just doesn’t happen later.
At that young age they have a deep understanding of their subject and aren’t held back by boundaries and expectations, and are keen to make a difference with discovery. Once they’re older that all changes with family, reputation to protect and, most of all, being seen to conform.
Excellent comment.
Thank you, not mine, just trying to remember what Huxley said.
@Charles Hedges: You posted this twice. Please see my comment on the second posting. JN
I suspect the difference is between “science” and “Science”.
The former is a decentralized process where researchers collaborate and compete to make new discoveries. The latter is a centralized and bureaucratic process where tenured academics collaborate and compete for government grant money that will pay their salaries.
Both systems work as designed. We just fail to recognize which one we’re funding.
Interesting article. I’m working on a book about how to increase innovation in the carmaking industry, and the problem of slowing innovation (not just in carmaking but all across the board) is one of the topics I cover.
(Or rather, I should be working on the book, but I am instead wasting all my time writing comments like this one. As per Peter Cook: “Two men are talking in a pub. One said, ‘I’m writing a book’. The other said, ‘Neither am I’.”)
I think we make fewer scientific discoveries these days for two reasons:
First reason — we’ve solved the easier problems. What’s left are problems that are more and more complex, and humans have a tough time with complexity. Science is reductionist, yet complex systems require a holistic approach.
One example is the obesity problem. Many people tell me that the answer is simple: fat people need to eat less and exercise more. But when you do the science, you find that it’s not that simple.
The main trouble is, as the New York Times recently reported, “Scientists Don’t Agree on What Causes Obesity, but They Know What Doesn’t”. All hypotheses have been proven wrong, and no one has any good ones left to test out in the real world. We’re baffled.
One thing that would help is to abandon intuition and keep a mind more open to maverick ideas. And to be more creative both in our hypotheses and in how we test those hypotheses. And to apply the newish scientific tool of causal inference to complex systems where effects have multiple causes that are difficult to identify and untangle (as with obesity).
Second reason — we penalize failure when we should encourage it. As Nassim Nicholas Taleb said about capitalism, the essence of science is encouraging failure, not rewarding success. We succeed in science through failure. We need new ideas to percolate from the bottom up, not try to force them from the top down.
We don’t really want just failure, of course, so there’s a lot more to this idea than these simple platitudes. But we do need more scientists working on the problems we face who can fail and still survive, not just a few well-funded scientists at the top of their fields who cannot afford to fail and so don’t try anything risky.
In the carmaking industry, for example, from 1900 to 1920 more than 2,000 companies built cars in the US. Only a handful of these carmakers survived, but still innovation exploded. Carmaking has been an oligarchic industry since then dominated by giants who are too big to let fail, and innovation has suffered for it.
Elon Musk is a poster boy for the right type of attitude, as he’s not afraid to try things that might fail. He knows you learn best by trial and error, and he put $44 billion at risk by buying Twitter to see if he could make it better. He’s not playing it safe. As he said a couple of months ago, “Please note that Twitter will do lots of dumb things in coming months. We will keep what works & change what doesn’t.”
Jeff Bezos is another poster boy, saying “Amazon was built on failure” and “We need big failures if we are going to move the needle — billion-dollar scale failures. And if we’re not [failing at that scale], we’re not swinging hard enough.”
Add investor and economist Bill Janeway as a third poster boy, who said, “The process of innovation is trial and error. And error. And error.”
Discoveries in science are not the same as innovations in industry, so the cures for the slowing pace of discovery and innovation may not be identical but I think they at least will rhyme. My analysis in the book will cover a lot more ground in a lot more depth, including how to simplify complexity, but these are some of the ideas I will explore in it.
Good luck with the book. If it’s as good as that short summary, it’ll be worth it – even if it fails at first!
“we penalize failure when we should encourage it.”
Worse still, in schools we teach that there is no such thing as failure.
“How, then, to combat this stasis, and guarantee more significant breakthroughs in medicine, engineering and climate science?”
The answer is in the final two words of the question, and it’s that climate science is regarded as of equal importance to the general priorities of medicine and engineering. The fact that this perception exists at all explains quite a lot about why scientific research seems to be slowing the pace of innovation. Huge amounts of damage have been done in the past 30 years by the colossal overreach of political influence into scientific communities, caused by the desire of a globalising political establishment to harness the authority of science into supporting the political aim of making climate change a priority over economic growth and democratic choice.
I’ll leave it there on the matter of climate change, because there are other reasons why science may be failing to spur innovation. One of the main things to appreciate here is that the classic perception of how science provokes innovation is wrong. It is not the case that scientists discover new phenomena, explain them and then engineers make products and services out of them. That does of course happen sometimes, but more often it’s something like this: a commercial research laboratory is doing one thing, and as part of it one day a technician observes something odd which he can’t explain: the “that’s funny” moment where something entirely unexpected turns up. It is then researched itself, and the process of finding applications for its use is already well underway by the time the science community has got to the point of explaining what it actually is.
What has changed in recent years is that the generalised bureaucratic nature of research funding has created an innovation infrastructure that is far less efficient and flexible when it comes to capturing such events and harnessing them to useful commercial innovation. It is this which needs to change.
I’m curious as to whether the rise of women in the workplace has also stifled creativity in the sciences. I’m being completely anecdotal here, but over the years I’ve noticed an alarming trend among women, particularly younger ones, in that they seem to be shutting down any thoughts or opinions that don’t gel with their own notions of fairness or safetyism. Or could it be that creativity and innovation are also linked to men’s primal sexual urges (as a healthy alternative to channeling these urges) and political correctness is slowly killing it off?
This article in Nature magazine kind of describes this phenomenon, albeit in the form of gender equity in research funding:
‘Game-changing’ gender quotas introduced by Australian research agency
It’s possible. The whole idea of fairness and safetyism is likely to suffocate anything challenging and consequently threatening. A no-risk society is not going to be very dynamic. My personal feeling about art, for instance, is that women, considering the numbers in the arts (the visual arts more specifically for me), have produced capable work but very little that’s outstanding. That may seem an unfair thing to say, but then that very idea stifles ideas. It was widely held that women were stifled because of the domination of men, but I think women by now have had enough opportunity to show their colours. The levelling out of the playing field, not just between men and women, has created mediocrity. And I have to say that even the work produced by men is no longer what it was.
Yup, men are more extreme in their thinking and as a consequence more radical and innovative.
It’s difficult to draw any definitive conclusions. The study may be looking at “pure” science, but just look at the science and innovation involved in – to name but three fields – our view of the universe via the Hubble and now the James Webb telescopes; the attempts to crack fusion power, which are just starting to bear the seeds of future fruit; the work going on at the Large Hadron Collider.
In the case of the latter, the discovery of the Higgs boson in 2012 hasn’t yet been followed up by similarly ground-breaking science, but it’s not for the want of trying. Indeed, a future breakthrough in physics which might lead towards the vaunted unified theory combining General Relativity with Quantum Mechanics might only happen once further discoveries are obtained. The “magical thinking” with which Einstein and the eminent particle physicists engaged during the first half of the 20th century may well have reached its limit without further evidence. Each field has indeed become specialised and fragmented.
The same applies in medicine, for instance. All hospital doctors are now highly specialised in a particular field – necessarily so; there’s just too much to keep up with to have an overarching view, even if such a thing had a practical value. I guess one of the main issues is funding of research. Only those areas likely to bear some return are likely to attract funding, and that’s before we even think about engaging in the “correct” type of research.
Maybe this is a reflection of the bureaucratization of universities and research facilities.
@Jim Veenbas.
Jim, you are right about that. There is a sort of unholy alliance between bureaucracy and woke activism – the bureaucrats encourage the woke activists in order to undermine those academics who encourage genuine critical thinking. This has also made it easier for those who are not capable of thinking for themselves to climb the greasy pole.
Any research proposal that might cast doubt on the Holy Writ of climate change simply won’t get funding. It’s as simple as that.
In the 1940s G M Trevelyan was concerned that the increase in higher education would produce an intellectual proletariat; was he correct?
Is wokism a function of creating a middle-class intellectual proletariat which is a blob, lacking innovation and fortitude?
Yes, absolutely. But then it’s not really an edjakashion.
Park et al have produced a very important paper IMHO. Let us hope it’s recognised across the sciences. For a complementary angle see a paper recently published by Paul Smaldino and Cailin O’Connor, which provides evidence for the importance of interdisciplinarity in promoting greater critical awareness of research methods. It’s in a new journal, Collective Intelligence (could THAT tell us something?). O’Connor is an innovator in using Agent-based Modelling in Philosophy of Science, and Smaldino himself has made contributions across a very wide range of disciplines. If he could be said to specialise, it is in the field of Complexity.
Here’s the reference:
Smaldino, P. E., & O’Connor, C. (2022). Interdisciplinarity can aid the spread of better methods between scientific communities. Collective Intelligence, 1(2), 26339137221131816.
Have looked through the paper and I only find reference to American science. Do the findings include, say, China? I think not. There is a tendency in the USA to believe that there is only one country in the world.
I don’t think they limited themselves to American science. Examples of publications included in their review are Science, Nature, and Proceedings of the National Academy of Sciences (USA). All three are prestigious journals (Nature is UK-based) that publish results from around the world. They also reviewed the USPTO (US patent office) public databases. Again, companies from around the world seek US patents.
It is certainly true that the total global number of patent applications, previously dominated by Western nations and institutions, is increasingly populated by the eastern bloc. What we are not seeing though is any evidence that global-level innovation is emerging from this.
Possibly over-specialization is a problem. There’s an old joke that an expert is someone who knows more and more about less and less until he knows everything about nothing.
A lot of ideas come from cross-fertilization between different sciences, and there was a breed of scientist whose knowledge did cross borders. I think Richard Feynman was one. I was impressed, reading books about him, that he was interested in everything, including safe-cracking and bongo playing.
Huxley gave a talk on the BBC explaining why we won so many Nobel Prizes.
Rigorous selection at 11 or 13 years of age. O levels taught one to remember facts. Early specialisation at A Levels, with university scholarship exams being first-year degree standard. Being able to go up to university at 17 years of age. Degree by the age of 20. Doctorate at 22. Most innovative work done at the end of the doctorate and first postdoc. Britain has always been short of cash, and Saturday morning found scientists buying their own equipment in Tottenham Court Road. In the USA and on the Continent people did not obtain doctorates until their mid or late 20s, which meant they missed out on their most innovative phase. Most scientists, Newton and Einstein for example, achieve most of their innovation in their early 20s. Newton developed calculus at university and then undertook most of his innovation from 1665 to 1666. I suggest that a small number of rigorously selected and trained people in a few universities is a better use of money than spreading it around. There are about 140 universities in the UK but only about 5 make major innovations. The university entrance/scholarship exams and Scholarship S Levels enabled 17 to 18 year olds to obtain a first-year degree education. This meant Cambridge/Imperial three-year degrees achieved the same standard as a five-year German Diploma or the six-year American B.Sc. and M.Sc. I knew people who obtained doctorates from Imperial in two and a half years when they were only twenty-two years of age. Bill Penney, Rector of IC, was an example.
Which Huxley was this? I’m assuming it’s Sir Julian Huxley but it would be nice to have clarity on this – would provide more context!
Andrew Huxley, who won the Nobel in 1963. In Britain we used to have a rigorous academic system where people studied French, Latin, Greek, Maths, Science, Divinity, History and Geography up to Ordinary Leaving Cert/O Level, which gave breadth. The Greek, Latin and French were probably A Level standard of today. Also, many scientists served on war work, which gave them extra experience.
The advantage was that public and grammar schools used to have after-school clubs in maths and science, which enabled pupils to broaden their knowledge beyond the curriculum.
It would be interesting to compare the knowledge required by someone to win a scholarship to Trinity, Cambridge in say 1982 and A Levels of today, especially in maths.
I can partially answer this question, though I only got an Exhibition (junior Scholarship) to a different Cambridge college in the 1960s. I finished up teaching maths there and marking scholarship exams. I suspect that the best current A level students *know* more than Scholarship students 50 years ago, but they are less good at thinking for themselves. I received almost no teaching in advanced mathematics at school (a small grammar school) because there was no one else doing the subject or, really, capable of teaching it. So I was self-taught and learned by solving problems from textbooks.
While the teaching was much better at Cambridge, it was very much sink or swim. By the end of a 3 year Cambridge degree the best students were certainly at the level of US students after 1 or 2 years of MSc study – but again by learning for themselves rather than directed intensive teaching.
It is this difference that was critical – the focus on problem solving, self-teaching and thinking for oneself rather than mastering knowledge in the form of a large body of “established” methods and ideas. In some respects we were quite ignorant and “unprofessional” with no inclination to cite dozens of papers. On the other hand we were less bound by conventional ideas and more willing to think across what were well-established boundaries in the US and Europe.
This is swings and roundabouts. Cambridge had a stellar record of producing Nobel-quality work in new fields. It was less good at encouraging and training the large number of diligent researchers who contribute most of the output in scientific disciplines. That is the trade-off today – do we focus on brilliant individuals or the much larger number of competent but relatively un-innovative researchers?
Good points. A friend who went up to Cambridge in the 1950s attended King Edward VI Grammar School in Birmingham, where he was taught maths by a Cambridge Wrangler. By the age of 17 he had achieved first-year degree standard maths and science.
Anthony Sampson, in his Anatomy of Britain (1965 edition), pointed out that 16% of Oxbridge came from the 2% of the population which attended Direct Grant Grammar Schools such as Manchester and King Edward VI. The book has a table for the late 1950s and early 1960s giving the number of scholarships won to Oxbridge over five years as a percentage of the number in the VIth Form, excluding Winchester and Westminster, which had closed scholarships. Dulwich was highest, with most of the top 20 being Direct Grant Grammar schools or those with large VIth forms, mostly grammar. From 1944 to 1975 Dulwich functioned as a Direct Grant Grammar school.
I doubt modern students know more. Calculus is no longer in O Levels and Maclaurin/Taylor series are not in A Levels. Cambridge Pre-U was an attempt to re-introduce the rigour of Oxbridge Entrance/Scholarship Exams. When people sat separate Pure and Applied Maths A Levels and then took S Level papers (for the top 15% at A Level), that was first-year degree standard maths.
I knew someone who won a scholarship to Imperial who taught second year maths at an American university, before going up to university.
I suggest we are realistic about innovation. It is achieved by only a very few, but it is vital. The Industrial Revolution, starting with Newton in 1666, was produced by 50 to 100 people over 250 years. Newton, Hooke, Newcomen, Brindley, Darby, Watt and Boulton give us the maths and physics, canals to provide cheap coal, the conversion of coal to coke, steam engines and the precision cutting of steel. The Stephensons, Robert and George, give us railways. Wedgwood, Arkwright and Boulton give us large-scale factory production of ceramics, cotton and wool.
Academic ability is not innovation; it can just produce performing seals. However, without academic ability there can be no innovation. Innovation is the Divine Spark, hence Michelangelo was called The Divine One.
The problem is that when academically bright people go to university and find they are not innovative, they invariably become resentful, embittered and spiteful. Both F. Whittle and B. Wallis have spoken about how people with doctorates tried to block their innovations. As B. Wallis said, “Everything I have achieved has been in spite of experts and not because of them”.
I can partially answer this question, though I only got an Exhibition (a junior scholarship) to a different Cambridge college in the 1960s. I finished up teaching maths there and marking scholarship exams. I suspect that the best current A Level students *know* more than the Scholarship students of 50 years ago, but they are less good at thinking for themselves. I received almost no teaching in advanced mathematics at school (a small grammar school) because there was no one else doing the subject or, really, anyone capable of teaching it. So I was self-taught and learned by solving problems from textbooks.
While the teaching was much better at Cambridge, it was very much sink or swim. By the end of a three-year Cambridge degree the best students were certainly at the level of US students after one or two years of MSc study – but again through learning for themselves rather than through directed, intensive teaching.
It is this difference that was critical – the focus on problem solving, self-teaching and thinking for oneself rather than mastering knowledge in the form of a large body of “established” methods and ideas. In some respects we were quite ignorant and “unprofessional” with no inclination to cite dozens of papers. On the other hand we were less bound by conventional ideas and more willing to think across what were well-established boundaries in the US and Europe.
This is swings and roundabouts. Cambridge had a stellar record of producing Nobel-quality work in new fields. It was less good at encouraging and training the large number of diligent researchers who contribute most of the output in scientific disciplines. That is the trade-off today – do we focus on brilliant individuals or the much larger number of competent but relatively un-innovative researchers?
It was Andrew Huxley, who won the Nobel Prize in 1963. In Britain we used to have a rigorous academic system in which people studied French, Latin, Greek, Maths, Science, Divinity, History and Geography up to the Ordinary Leaving Certificate/O Level, which gave breadth. The Greek, Latin and French were probably at today’s A Level standard. Also, many scientists served on war work, which gave them extra experience.
The advantage was that public and grammar schools used to have after-school clubs in maths and science, which enabled pupils to broaden their knowledge beyond the curriculum.
It would be interesting to compare the knowledge required to win a scholarship to Trinity College, Cambridge in, say, 1982 with that required by today’s A Levels, especially in maths.
Which Huxley was this? I’m assuming it’s Sir Julian Huxley, but it would be nice to have clarity on this – it would provide more context!
Huxley gave a talk on the BBC explaining why we won so many Nobel Prizes.
Rigorous selection at 11 or 13 years of age. O Levels taught one to remember facts. Early specialisation at A Level, with university scholarship exams set at first-year degree standard. Being able to go up to university at 17, take a degree by 20 and a doctorate by 22. Most innovative work is done at the end of the doctorate and in the first postdoc. Britain has always been short of cash, and Saturday morning found scientists buying their own equipment in Tottenham Court Road. In the USA and on the Continent, people did not obtain a doctorate until their mid or late twenties, which meant they missed out on their most innovative phase. Most scientists, Newton and Einstein for example, achieve most of their innovation in their twenties. Newton developed calculus at university and then undertook most of his innovation from 1665 to 1666.
I suggest that a small number of rigorously selected and trained people in a few universities is a better use of money than spreading it around. There are about 140 universities in the UK, but only about five make major innovations. The university entrance and scholarship exams and the S Level papers enabled 17- to 18-year-olds to obtain a first-year degree education. This meant that Cambridge and Imperial three-year degrees achieved the same standard as a five-year German Diplom or the six-year American BSc and MSc. I knew people who obtained doctorates from Imperial in two and a half years, at only twenty-two years of age. Bill Penney, Rector of IC, was an example.
Wouldn’t it be fairer to say that the more that is known, the harder it is to do new as opposed to incremental things? That is, the previously abundant low-hanging fruit has now largely been picked, and truly new things require disproportionately more effort to discover.
But then again, shouldn’t the accumulation of knowledge help us penetrate the next mystery? Or is that the problem: that those very techniques and successes have created a particular way of addressing things which acts as blinkers? We actually need to break out of that patterned thinking, even though it led to so many successes.
Do scientists need a private income to be innovative? Newton had the family farm; Darwin, his wife’s inheritance; Mendel was an abbot; and Einstein, a patent clerk.
What about my discovery in the theory of our social system of macroeconomics? Is this not a new science? It has now become a true science, to replace the pseudo-science that experts have always wrongly taken it to be. As an idealist who believes in sharing good scientific knowledge and its better understanding, I offer my book about it for free. Write to me at [email protected] and I will happily send you an e-copy of my 310-page book, “Consequential Macroeconomics”. It will allow you to think straight, using cold logic to eliminate what was vague and complex in your mind about this previously intuitive and biased subject.