Why a little knowledge is a dangerous thing
Almost everything that is wrong with the world is caused by coordination problems

They're talking about how the best findings in social psychology never replicate (Photo: 20th Century Fox)

September 9, 2020

My favourite film is Fight Club. I worry sometimes that that might trigger alarm bells in a certain kind of mind, like those people who think that enjoying David Foster Wallace’s book Infinite Jest makes you ideologically unsound. But I was 18 when it came out and it absolutely blew me away. I’m sure there are better films, but they didn’t happen to come out when I was an impressionable teenager.

Anyway. In my fond imaginings, when my children are old enough, we’ll watch Fight Club together. I would like to see their reactions to it. Specifically, I would like to see their reaction to the twist at the end of the second act. Then, even better, we’d watch it again, and see if the film stands up on a second viewing, when you know what the twist is. (It does.)

But, realistically, that will never happen. Because there is just no way they are going to get through the next several years of their life without someone, somewhere, spoiling the twist for them. Even if almost everyone agrees on a norm of “no spoilers”, it just takes one person to disagree with that norm, and tweet out/post on Facebook/stand in Piccadilly Circus with a megaphone shouting “Bruce Willis was dead the whole time”* to undermine the whole thing. And if there are millions of people, you can be pretty sure that someone is going to do exactly that. They don’t do this because they want to ruin films for everyone: they actually think that it’s fine to give away plot points, as long as the film is sufficiently old.

This is not going to be a piece about spoilers, although I am tempted to make it one. My own position is that now that we can all watch any film from any era at a moment’s notice, wouldn’t it be polite to try to keep plot details quiet? I still wonder what it would have been like to watch The Empire Strikes Back when it came out and not know the I-am-your-father twist. If you want to talk about the plot, do it in DMs, not in public.

But I digress. There’s a concept, first mooted by the philosopher and AI theorist Nick Bostrom, called the “unilateralist’s curse”. It’s probably easiest to illustrate with an example, so here’s Bostrom’s own. Imagine a group of scientists researching an HIV vaccine. While doing so, they accidentally discover a way to make a variant of the virus that can spread via airborne droplets. They could publish their findings, or keep them quiet, “knowing that it might be used to create a devastating biological weapon, but also that it could help those who hope to develop defenses against such weapons”.

Most of them think it’s a bad idea to publish, so they keep quiet. But one disagrees, and mentions the finding at a scientific conference, so the discovery spreads rapidly.

The two options are asymmetric: keeping the information quiet relies on everyone doing it, while making it public can be done unilaterally. If the dissenting scientist is wrong, and making the information public is actually a terrible idea — which, let’s face it, sounds likely — then the wrong outcome is almost inevitable, at least if there are lots of people involved.
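
The arithmetic behind that near-inevitability is simple. As a minimal sketch (the numbers here are my illustrative assumptions, not figures from Bostrom’s paper): suppose each of n well-meaning actors independently misjudges release as beneficial with some small probability p. The chance that at least one of them goes ahead is 1 - (1 - p)^n, which races towards certainty as the group grows.

```python
# Illustrative sketch of the unilateralist's curse, with assumed numbers:
# if each of n well-meaning actors independently misjudges release as
# beneficial with probability p, how likely is at least one release?

def p_unilateral_release(n: int, p: float) -> float:
    """Probability that at least one of n actors acts unilaterally."""
    return 1 - (1 - p) ** n

for n in (1, 10, 50, 100, 1000):
    print(f"{n:>4} actors, 5% error rate each: "
          f"{p_unilateral_release(n, 0.05):6.1%} chance of release")
```

With a 5% individual error rate, ten scientists already give a 40% chance that the finding gets out; fifty make it better than nine in ten.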

Bostrom’s HIV example may sound somewhat far-fetched, but it’s not. In 2016, Canadian scientists revived the extinct horsepox virus, stitching it together from mail-order synthetic DNA fragments. Horsepox is a close cousin of smallpox; the scientists believed that it could be used to create a better smallpox vaccine. Others pointed out that the same technique could also be used to bring back smallpox itself. In 2018 they published their findings, despite widespread concerns that the risks outweighed the benefits.

Or consider the development of the nuclear bomb; Bostrom points out two occasions when the unilateralist’s curse has applied. In 1939, the Polish physicist Joseph Rotblat “noticed that the fission of uranium released more neutrons than used to trigger it, realising that it could produce a chain reaction leading to an explosion of unprecedented power”, Bostrom writes. Other scientists, he assumed, had had this insight too. Initially he kept it to himself, but then grew concerned about German efforts to create a bomb; so he told James Chadwick, his colleague at the University of Liverpool, and essentially started the British nuclear programme. Later he would decide this had been a bad idea, and helped start the Pugwash anti-nuclear campaign, for which he would win the 1995 Nobel Peace Prize.

And in 1979, the American magazine The Progressive published the mechanism of the hydrogen bomb, then known to only a handful of states. When the US government sued to block publication, the magazine argued that nuclear secrecy was a greater risk than other nations developing the bomb.

All these examples are of the spread of information, which has the toothpaste-like quality of being easy to squeeze out but hard to put back in the tube. But the unilateralist’s curse applies to other situations. For instance, it only takes one person to introduce a breeding colony of rats to a Galapagos island — or rabbits to Australia — to devastate an ecosystem.

Or, very soon — maybe even right now — we could take steps to geoengineer our planet to reduce the impact of climate change, perhaps by pumping aerosols into the stratosphere to reflect some of the sun’s rays. It might be a good idea; it might not. But if, say, 50 countries have the technological capacity to do it, and 49 countries decide it’s too dangerous, then it will happen. (And, of course, now it’s not just nation states who can do this: Silicon Valley billionaires are discussing geoengineering methods of reducing ocean acidification. The more potential actors there are, the more likely it is that one of them will act.)

More prosaically: almost all of us agree that abusing women and people of colour on social media is not a good or helpful thing. But some small percentage disagree, and it only takes a very tiny percentage to send a huge number of abusive messages and make Twitter an unmanageable hellscape.

Or much less prosaically: Bostrom and others, such as his fellow Oxford philosopher Toby Ord, worry about existential risks — that is, things which could wipe out humanity, or at least permanently and severely limit its future. The most pressing, they think, are bioengineered pandemics and artificial intelligence, both of which are vulnerable to the unilateralist’s curse — both at the development stage (if 99.9% of the world’s AI companies agree to keep their amazing superintelligent AI designs secret, but one single company publishes its code on GitHub, then it’s out in the open) and at the use stage (it only takes one psychopath to release a genetically modified virus on the Tube, even if no one else thinks that’s the best idea).

These are incredibly hard problems to solve. How do you convince hundreds of people to abide by a decision not to share some information, if some number of those people are sure that sharing that information is the good and moral thing to do? How do you convince millions of people not to send horrible messages on Twitter, if some small number of those people think that those horrible messages are just what the recipients deserve? But if you don’t convince all of them — or at least almost all — then it hasn’t worked.

(And remember, this is all just assuming that everyone is acting in what they think is humanity’s best interests. If some people are selfish bastards or appalling trolls, then it all gets even worse.)

Bostrom considers a “principle of conformity”: a rule or social norm that we don’t act unilaterally even when we think doing so would be for the common good. It’s quite a small-c conservative idea — Chesterton’s Fence-like — but, as Bostrom points out, there are obvious concerns, not least that quite a lot of good has come in the past from people acting unilaterally (he gives the example of the leak of the Pentagon Papers). Plus (it seems to me) the principle of conformity itself suffers from the unilateralist’s curse — if 99% of us agree to conform to it, but 1% don’t, then a lot of unilateral decisions are still going to get taken.

I have no clever answer for this. It seems to me a really big problem; it’s part of a wider class of problems, coordination problems, that arise when you have lots of individuals making decisions.

Game theory is all about coordination problems. Take the tragedy of the commons: you have some common resource of constant size, such as a meadow. Imagine it can support 100 cattle comfortably, and there are 10 farmers. If each farmer has 10 cattle, everything works out fine. But if one farmer adds an 11th cow, then that farmer gains 100% of the benefit from that cow, but pays only 10% of the cost. And if another farmer does the same, and another. It’s in everyone’s best interest if they all stick at 10, but it’s in each individual’s best interest if they add more and more: so the resource inevitably gets degraded. And if your neighbour does it, you’d be a sucker not to do it yourself. The parallels with the climate crisis could hardly be more obvious.
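
A toy payoff calculation makes that incentive gap concrete. This is a sketch with invented numbers (the 100-cow meadow and ten farmers come from the example above; the benefit and cost figures are assumptions of mine): each cow yields a fixed private benefit to its owner, while the damage done by each cow beyond the meadow’s capacity is split ten ways.

```python
# Toy tragedy-of-the-commons payoff. All numbers are made-up illustrations:
# each cow earns its owner a fixed private benefit, while the damage from
# every cow over capacity is shared equally among all ten farmers.

N_FARMERS = 10
CAPACITY = 100          # cows the meadow supports comfortably
BENEFIT_PER_COW = 10.0  # private gain per cow (assumed)
COST_PER_EXTRA = 15.0   # communal damage per cow over capacity (assumed)

def my_payoff(my_cows: int, others_cows: int) -> float:
    overgrazing = max(0, my_cows + others_cows - CAPACITY)
    shared_cost = overgrazing * COST_PER_EXTRA / N_FARMERS
    return my_cows * BENEFIT_PER_COW - shared_cost

# The other nine farmers keep 10 cows each (90 in total). Do I add an 11th?
print(my_payoff(10, 90))  # 100.0 -> stick at ten
print(my_payoff(11, 90))  # 108.5 -> the extra cow pays off, for me alone
```

Note that the eleventh cow destroys 15 units of communal value while creating only 10, so the group as a whole is worse off; but its owner pockets all 10 and bears only 1.5 of the damage, so adding it is individually rational. Every farmer faces the same sums.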

Almost everything that is wrong with the world, I think, is caused not by people being bad or evil, but by coordination problems. The unilateralist’s curse shows that this doesn’t just apply when people are acting self-interestedly, but even when everyone is trying to be noble and good. And short of banning whistleblowers, ending open science and generally sliding into totalitarianism, I’m not sure what the best way to minimise the problem is.

Film spoilers are, probably, the most trivial example of this you can imagine. But they matter to me. By the time my kids are old enough to watch an 18-rated film, probably in the year 2027 or something (I won’t make them wait until they’re actually 18, obviously; I’m not Mary Whitehouse), it is functionally impossible that they won’t have seen some meme or other giving away the twist in Fight Club. Someone will spoil it for them. And, by extension, some bugger is clearly going to use a lorry full of drones to release aerosolised smallpox over New York City, and there’s sod all we can do to stop it.

 

*Yes, this is a different film. No, I’m not giving away the twist in Fight Club in a piece pegged to the idea of not giving away the twist in Fight Club.

 

 


Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.
