Asteroids reflect the neuroses of their time. Credit: Don't Look Up.


March 24, 2025   5 mins

During the reign of Napoleon Bonaparte, the sky began to fall. Near Normandy, several locals saw rocks smash into the ground on 26 April 1803. At the time, the idea that rocks could fall from the sky was dismissed by the French intelligentsia as superstitious nonsense. But there was enough hullabaloo in Normandy for Napoleon’s interior minister to send Jean-Baptiste Biot to find out what had happened.

Biot was a young professor of mathematics working within the newly established scientific method. He examined local rock formations, compared them to the rocks claimed to have fallen from the sky, and interviewed witnesses ranging from clergy to coachmen. The new rocks were of inarguably different geological composition from the local rocks. And the locals’ stories matched up. Biot had no choice but to conclude that the elites were wrong, and that the rocks had indeed fallen from the sky.

Today we are much better informed about the threat of asteroids. Thanks to a worldwide network of astronomers and telescopes, our species can keep track of the larger asteroids that might one day collide with Earth. Smaller asteroids hit Earth all the time, but larger asteroids, of the kind that wiped out the dinosaurs or even created the Moon, are unlikely to strike us in the foreseeable future.

Nevertheless, Asteroid 2024 YR4 made headlines this year. It is 90 meters wide: big enough to destroy a city, though nowhere near big enough to threaten the planet. Asteroids of this size hit us infrequently: roughly every 2,000 years, according to NASA statistics. Those statistics derive from the work of scientists such as Melissa Brucker, principal investigator of Spacewatch at the University of Arizona Lunar and Planetary Laboratory. Brucker’s team looks at roughly 1,300 near-Earth objects a year. Among them are some 160 “potentially hazardous” objects, which, like Asteroid 2024 YR4, have a non-zero (though still very small) probability of hitting Earth. Brucker’s lab denotes risk with a scale going from zero to eight, where eight corresponds to “certain collisions”. For a brief time, YR4 was a three, midway in the “meriting attention by astronomers” zone. (Sometimes, we spot asteroids of this size only when they’ve gone past us, when it is too late for us to do anything.)
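That “roughly every 2,000 years” figure supports a quick back-of-envelope calculation. If we treat city-killer impacts as a Poisson process with a 2,000-year mean return period (an assumption for illustration; the lifetime figure is likewise assumed, not from NASA), the chance of witnessing one in a human lifetime works out to a few percent:

```python
import math

# Assumption: impacts follow a Poisson process whose mean return period
# is the "roughly every 2,000 years" figure cited from NASA statistics.
RETURN_PERIOD_YEARS = 2000
LIFETIME_YEARS = 80  # assumed human lifetime, for illustration only

rate = 1 / RETURN_PERIOD_YEARS  # expected impacts per year
p_lifetime = 1 - math.exp(-rate * LIFETIME_YEARS)

print(f"Chance of at least one such impact in {LIFETIME_YEARS} years: {p_lifetime:.1%}")
# prints: Chance of at least one such impact in 80 years: 3.9%
```

A roughly one-in-25 lifetime chance of a city-killer arriving somewhere on Earth: small, but not negligible, which is why surveys such as Spacewatch keep counting.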

As Brucker and her fellow scientists gathered data, it temporarily appeared possible that the city-killer might hit Earth. Even then, the asteroid was much more likely to blast into the sea than into a city, and NASA has already demonstrated that it can deflect asteroids by hurling a probe into them. We were almost certainly safe from YR4, but the public was nonetheless riveted.

“It temporarily appeared possible that the city-killer might hit Earth.”

In this regard, modern humans are not unique. Biot’s youth was just about the only period of history for which we have no evidence of human preoccupation with asteroids or comets. Even the Sumerians, one of the most ancient known civilisations, left us a cuneiform clay tablet recording a meteor impact that took place 5,000 years ago. Perhaps 5,000 years from now, our descendants will dust off the VHS tape of Armageddon, a Hollywood hit of 1998. In the film, a crew led by Bruce Willis sets out to drill into the surface of an oncoming asteroid and detonate a nuclear bomb inside it.

Whether they are of the past or present, asteroids reflect the neuroses of their time. To the ancients, they were portents of divine disfavour. To those of us living in the 2020s, they can articulate our other existential concerns. In Don’t Look Up, a popular Netflix release of 2022, Jennifer Lawrence and Leonardo DiCaprio play astronomers whose thwarted efforts to prompt action are redolent of the frustrations of modern climate scientists.

Humanity’s pan-historical preoccupation with asteroids can be put down, at least partially, to the obviousness of the threat. We can all get our heads around the idea of a large rock hitting the Earth very hard. Hence our fascination with asteroids at the expense of other threats. Human attention sometimes has only a very weak correspondence with danger. It’s easier to focus on the thing that is splashing across headlines than the risks we deal with every day. This is why the fear of flying is far more common than the fear of driving, even though air travel is the safest form of transportation.

Artificial Intelligence, which leaders in education and business urge us to use, gives us another example of the miscalculation of risk. In this case, the risk is less salient to human psychology than perhaps it should be. A recent report on the safety of advanced AI warns of “disinformation and manipulation of public opinion” in elections. General-purpose AI could displace most of our jobs within 10 years. Given such forecasts, it is remarkable how much attention is taken up by tariffs. AI could also be used to disrupt systems that are over-reliant on it, including those critical to our society, such as finance (imagine the stock-market panic) or healthcare. Self-improving AI, it has been theorised, could threaten the existence or the flourishing of our species.

Despite these circumstances, there is little to no regulation of AI. In California, Governor Newsom vetoed a bill that would have made developers such as OpenAI legally liable for misuse of their models. In the UK, the government’s AI Security Institute (AISI) assesses the ability of new AI models to assist in the creation of weapons of mass destruction, but does not yet have the power to enforce changes to the models. As for the Trump administration: Elon Musk lent his name to an initiative warning of the risks of AI, but that was two years ago, and he has since set up a frontier lab of his own, xAI. In February, Musk’s new colleague JD Vance told world leaders that “the AI future is not going to be won by hand-wringing about safety”.

Such pronouncements seem incommensurate with the severity of the threats that AISI is screening for. AI is a hugely complex social phenomenon as well as a technical one, which means that its risks are harder to get our heads around than the threat of an oncoming asteroid. Similarly, it seems that we have learnt little from the Covid pandemic, whose first lockdowns were imposed five years ago this month. The disease forecasting company Airfinity suggested in 2023 that a pandemic of similar magnitude could emerge in the next 10 years. Climate change makes that outcome more likely; a meta-analysis published in Nature in 2022 found that 58% of infectious diseases “have been at some point aggravated by climatic hazards”.

And while the risk posed by asteroids remains static, the risk of pandemics is increasing. Last December, a group of concerned scientists warned in Science of the risks posed by hypothetical “mirror life”: yet-to-exist life-forms whose DNA, proteins and other molecules are inverted, as if reflected in a mirror. This inversion could, according to other scientists, make even simple viruses much harder for the immune systems of both plants and animals to detect. Citing falling costs, continuing innovation and a lack of regulation, the scientists said that mirror life could be developed within the next decade. Researchers are now beginning to discuss how best to ensure the risks of mirror life never arise.

Mirror life is just one variety of potentially catastrophic pathogen that could, in theory, one day slip out of a lab. Worldwide, there are dozens of biolabs that deal with dangerous pathogens, more than enough to give us uncomfortably high odds of an experimental virus escaping its creators.
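The arithmetic behind those “uncomfortably high odds” is easy to sketch. Every number below is an assumption chosen for illustration (the article gives none): a notional count of high-containment labs, a notional annual escape probability per lab, and an assumption that labs and years are independent. Even with a tiny per-lab risk, the aggregate chance of at least one escape over a decade grows quickly:

```python
# Illustrative only: N_LABS, P_ESCAPE_PER_LAB and YEARS are assumed
# figures, not data from the article or any published estimate.
N_LABS = 50               # "dozens" of labs handling dangerous pathogens
P_ESCAPE_PER_LAB = 0.002  # assumed annual escape probability per lab
YEARS = 10

# Probability of at least one escape anywhere over the period,
# assuming independence across labs and years.
p_any = 1 - (1 - P_ESCAPE_PER_LAB) ** (N_LABS * YEARS)

print(f"Chance of at least one escape in {YEARS} years: {p_any:.0%}")
# prints: Chance of at least one escape in 10 years: 63%
```

The point is not the specific numbers but the shape of the curve: many small, independent risks compound into a large aggregate one, which intuition tends to underweight.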

Biot, when examining the fallen space rocks more than two centuries ago, showed us how to scrutinise the evidence. The modern era demands a deeper level of analysis, one that goes beyond the forecasting of particular events and addresses the root causes of the dangers created by humans. Asteroids will not kill us, but other perils pose greater risks, especially when they do not make intuitive sense to human minds. Don’t just look up for existential threats; look around.


Elizabeth Howell is a space and business journalist based in Ottawa, Canada.
