Leonardo da Vinci painted the original polymath Aristotle. Credit: Godong/Universal Images Group via Getty Images


May 22, 2020

Becoming a polymath used to be fairly straightforward. First, study hard, and ideally start early: Blaise Pascal was thinking up ground-breaking mathematical theories in his teens — useful preparation for inventing existentialism and the calculator. Second, find a patron: Leonardo da Vinci’s career as artist, inventor and scientist was supported by his work as Cesare Borgia’s chief engineer. Third, try to combine your interests: the Welshman William Jones, a pioneering scholar of law and languages, found his niche by accepting a position on the Supreme Court in Bengal.

Today the barriers to such a career are formidable. Learning has retreated to the universities, which demand specialisation. It’s hard to imagine a contemporary chemistry professor informing his employer (as Michael Polanyi did at Manchester in 1948) that he would like to switch to philosophy. And of course, any serious intellectual work requires the increasingly rare skill of not looking at your phone every ten minutes. As Peter Burke writes ominously in his timely new book The Polymath: A Cultural History from Leonardo da Vinci to Susan Sontag, present-day polymaths (Jared Diamond, Raymond Tallis) tend to be “scholars who were already middle-aged before the digital revolution occurred.”

Who cares, you may ask. It may be very impressive to be able to design a bridge in 27 languages, but is it any more useful to the world than juggling flaming torches on a unicycle? And given that one man’s polymath is another man’s charlatan, should we really be alarmed if the species is endangered? Burke’s book makes a strong case that we should be.

He does this by including not just the obvious superminds like Athanasius Kircher (a ludicrously wide-ranging scientific researcher who also wrote an encyclopedia of China and kickstarted the study of Egyptology) but also “polymaths of the second rank”. That means humanistic scholars such as René Girard, who applied his theory of desire first to literature, then to anthropology and eventually to religious history; geniuses with an extra string to their bow, like Nabokov with his dabblings in the study of butterflies; and even popular writers like Macaulay and Voltaire, just because they had such diverse interests.

This does broaden the meaning of the word almost to the point of meaninglessness. But it also reminds us that first-rate polymaths are not a separate species. Their strengths — curiosity, a capacity for hard work, a good memory, the ability to focus — are not unique; they just have more of them than everyone else. Few of us will have occasion to ask ourselves, as Joseph Needham did in the opening line of his autobiography, “How did it happen that a biochemist turned into a historian and sinologist?” But if we lived for a thousand years, we might have a similar story to tell.

The decline of polymathy, then, suggests a broader crisis. For Burke, it is a crisis of too much information. The seventeenth century was a “golden age of polymaths”, as explorers found new regions, the scientific method flourished, and the postal service and the proliferation of journals allowed scholars to trade ideas. But those same forces led to “information overload”.

Over the next 200 years, the intellectual world divided between the specialists who knew a lot about their little area, and popularisers who knew a little about a lot. Institutions, as well as individuals, had to go their separate ways: in the 1880s the Natural History Museum split off from the British Museum, and the Science Museum from what is now the V&A. The twentieth century saw some conscious efforts to foster “interdisciplinarity”, but the fragmentation of knowledge only accelerated — even before the internet came along.

This is Burke’s version of events, and it is obviously a large part of the story. But there is surely another reason for the decline of the polymath: namely, the intellectual revolution of the sixteenth and seventeenth centuries, when as John Donne put it in 1611, “new philosophy calls all in doubt.”

That new philosophy claimed, like today’s political leaders, that it was merely following the science: instead of theorising about the celestial spheres, just look through a telescope! But there was a sinister undercurrent, as Donne realised: the new philosophers sometimes seemed to imply that, if you did follow the science, you might well find a cold, dead universe in which our beliefs about the beauty, harmony and meaning of the world around us would be exposed as delusions.

When Dante gazed at the night sky, he saw “the love which moves the sun and other stars”; 350 years later, Pascal looked at the same thing and recorded that “the eternal silence of these infinite spaces terrifies me.” The contemplation of the world, it appeared, might not lead us to sublime truths, but to disenchantment.

Before this revolution, as Burke observes of the Middle Ages, “Wide-ranging curiosity was normal … and might even be described as the default setting.” I suspect that pre-modern all-rounders — Aristotle, Shen Gua, Avicenna, Hildegard of Bingen — took for granted that all their different studies had an essential unity. Combinations which we think of as quirky — such as Roger Bacon’s moving between astronomy, theology, optics and linguistics — came naturally. Polymaths could assume that very different kinds of intellectual work were all approaches to the truth.

But over the last 400 years, we have had to deal with an underlying anxiety: what if only one kind of study — measuring things and making mathematical laws from the results — really gets you to rock-bottom reality? If that’s correct, then a beautiful piece of music, ultimately, is just sound waves hitting your ears. Love, truth and goodness are just your neurons firing. As for philosophy — well, according to Stephen Hawking, “philosophy is dead”, and as Richard Dawkins once memorably asked, “What did Plato say that was actually right?”

This nagging anxiety threatens polymaths more than anyone: if the scientific method, and only the scientific method, reveals the ultimate truth of things, then philosophy, art, music, history, philology and so on are, at best, interesting digressions from the real work. As George Steiner – who died in February, just a few weeks before his fellow polymath Roger Scruton – put it: “Many of the traditional humanistic disciplines have shown a deep malaise, a nervous, complex recognition of the exactions and triumphs of mathematics and the natural sciences.” Steiner’s response was to defend language, “the word”, as something which can also reveal the truth.

Over the last century, other polymaths have tried to show that the scientific method isn’t everything. Michael Polanyi, himself a distinguished scientist, defended “tacit knowledge” — those things we know even if they can’t be written down as laws. Raymond Tallis, a clinical neuroscientist among many other things, has written incisively about why neuroscience isn’t enough. Scruton wrestled with the same question, though he often sounded rather glum about it.

In bridging the gap between science and everything else, our best hope is a school of philosophy which has gained momentum in recent years. It suggests that many of the old philosophical assumptions, the ones which were overturned four centuries ago, are actually the only sure foundation for modern science. And the key, according to this new school, is a return to Aristotle. So if the twenty-first century sees the rebirth of the polymath, we may be able to thank the father of the whole tribe.


Dan Hitchens writes the newsletter ‘The Pineapple’ and is a former editor of the Catholic Herald