You are the crowd: milling friends in London at the weekend. Credit: Ollie Millington/Getty


March 24, 2020

There is a saying, which I think comes originally from a German government transport campaign but became an advertising slogan and general meme: “You’re not stuck in traffic. You are traffic.”

It seems relevant, now we’re in lockdown. I’d been thinking about it a lot recently with all the images of Columbia Road Flower Market, and the Tube, and various Norfolk coastal resorts, all rammed with people. Presumably those photos were taken by humans who were at the time in Columbia Road Flower Market, or on the Tube. You’re not in a crowd; you are a crowd. No doubt you have excellent reasons for being there, but no doubt so does everyone else. Hence, I presume, the lockdown.

It’s the same with panic-buying (with the caveat that true “panic-buying” and major stockpiling probably isn’t actually going on very much, as the excellent Stephen Bush of the New Statesman points out). In a chemist’s near my house, a parent was grumbling that all the Calpol had gone: “It’s so selfish. What if my kid gets ill?” But, of course, everyone else is thinking the exact same thing; that’s why there are no Calpols.

These concerns all map very neatly onto game theory, and specifically onto the game-theory ideas known as “coordination problems”. The two most famous are probably the “prisoners’ dilemma” and the “tragedy of the commons”.

In the prisoners’ dilemma, two people are suspected of a crime, and are locked up separately. The police have enough evidence to convict them on a lesser charge, but they’re told – separately – that if they grass up their accomplice, the accomplice will get a longer sentence and they will go free. Of course, if they both turn grass, then they’ll both get long sentences (but these won’t be as long as if they were to stay schtum while their buddy narks on them).

The options are called “cooperate” (with your accomplice) or “defect”. So if you both cooperate, you get one year. If you both defect, you get two years. But if you defect and your accomplice cooperates, you walk free and they get three years (and vice versa).

Obviously, the best outcome from the point of view of minimising the total number of years served is for both to cooperate. You each serve one year; it’s just two years in total. But if you know your buddy is going to cooperate, it’s best to defect, because then you get away.

But your buddy knows that too! So he will defect. But then: if he defects, it’s still best for you to defect, so you get two years rather than three.

So even though you both could have got out after one year by cooperating, the optimal, “rational” (in the economic sense) decision is always to defect, so you end up serving two years. (Watch this gameshow clip for a remarkably dramatic demonstration; it’s so illustrative of the dilemma that it attracted academic attention.)
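To see why defection dominates, here is a minimal sketch in Python of the payoff table above, using the sentence lengths from the example (lower is better); the dictionary and function names are just my own illustration, nothing standard.

```python
# A minimal sketch of the dilemma above: years[(mine, theirs)] is the
# sentence I serve, given my choice and my accomplice's.
years = {
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"):    3,
    ("defect",    "cooperate"): 0,
    ("defect",    "defect"):    2,
}

def best_response(their_choice):
    """The choice that minimises my sentence, given what they do."""
    return min(["cooperate", "defect"],
               key=lambda mine: years[(mine, their_choice)])

for theirs in ["cooperate", "defect"]:
    print(f"If they {theirs}, I should {best_response(theirs)}")
# If they cooperate, I should defect
# If they defect, I should defect
# Defecting wins whatever they do, so two "rational" players end up serving
# two years each instead of the one year each that cooperation offered.
```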

The tragedy of the commons is a similar but grander-scale idea. Imagine you have some common resource: a meadow where everyone can graze their livestock. There’s enough grass to graze 100 sheep comfortably and there are 10 farmers; if everyone keeps a flock of 10, the meadow will remain healthy and the animals will have enough to eat.

But if one farmer decides to add an 11th sheep, he gains all the value of that sheep, but pays only 10% of the cost to the resource. Of course, the other farmers make the exact same calculation, so everyone puts on an 11th sheep, and a 12th. The meadow gets overgrazed; the sheep become malnourished; and everyone ends up poorer than they would have been if they’d just cooperated, even though everyone was acting perfectly rationally (again, in the economic sense) at every juncture.
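The same logic can be put in a few lines. This is a toy sketch, not a real model: I’ve assumed each sheep is worth one unit to its owner and that every sheep over the meadow’s capacity shaves a twentieth of a unit off the value of every sheep, numbers chosen purely to make the arithmetic visible.

```python
# A toy model of the meadow. The damage figure (0.05 per excess sheep)
# is an assumption for illustration, not from anywhere in particular.
CAPACITY = 100
DAMAGE_PER_EXCESS_SHEEP = 0.05   # assumed shared cost of overgrazing

def farmer_payoff(my_flock, total_flock):
    """Value my flock is worth to me, after overgrazing damage."""
    excess = max(0, total_flock - CAPACITY)
    value_per_sheep = 1.0 - DAMAGE_PER_EXCESS_SHEEP * excess
    return my_flock * value_per_sheep

print(farmer_payoff(10, 100))  # 10.0  -- everyone sticks to ten sheep
print(farmer_payoff(11, 101))  # 10.45 -- I alone add an 11th: I gain
print(farmer_payoff(11, 110))  # 5.5   -- everyone adds one: we all lose
```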

If you could coordinate the responses of the different agents – the prisoners or the farmers – the best outcome would be easy to achieve. If the prisoners could just talk, or some authority simply ordered the farmers to stick to 10 and punished those who didn’t, that would be the end of the problem. But the prisoners can’t talk, and there’s no Farming Czar to step in.

These are just thought experiments, but they turn up in real life. In the Cold War, the USA and USSR could have spent billions of dollars/roubles on hospitals and schools and made everyone’s life better, or on lots of big missiles with nuclear bombs on the top. You can cooperate and build the schools, or you can defect and build the missiles. As the thought experiment would predict, they ended up building the missiles.

And the parallels between the tragedy of the commons and the climate situation — it’s in everyone’s interest if we all reduce carbon use, but if you do emit carbon, you gain all the benefit of your economic activity while paying only a tiny fraction of the cost — are almost too obvious to state. (Although I did just state them anyway.) Again, if some world government were to be set up and punish countries for emitting too much, it would be simple to solve; but no real world government exists, so coordination remains hard.

In the time of the coronavirus these problems are being illustrated amazingly starkly. It’s in everyone’s interests if everyone strictly self-isolates and washes their hands 10 times a day. But these actions are costly – self-isolation is lonely, and thanks to the washing my hands at least are red and cracked and look like they’ve aged 20 years. So if everyone else were to do it, but you carried on walking around, licking lampposts, cuddling strangers and never washing your hands, just as normal, then you’d gain all the benefit of that while (assuming you’re relatively young and healthy) paying only a fraction of the increased risk.

Similarly, it’s in everyone’s best interests if we all agree to only buy one bottle of Calpol so the shelves are always full – but if you don’t trust other people to do that, it’s in your interest to buy extra, to make sure you have some. So everyone buys more than they need, spending too much money, and other people end up with none, even though there’s enough to go around.

That’s before you get into the utilitarian calculus: the moral mathematics of “if I do this, it will probably cause X good things but Y bad things; is X bigger or smaller than Y?” Doctors are going to have to decide who lives and who dies. Decisions made by politicians will do the same, and not always as obviously as you might think; Jeremy Farrar, the director of the Wellcome Trust, points out that during the Ebola crisis, more people died of malaria than of the actual epidemic, because all the resources that would have been directed to curing them went to stopping the spread of the disease.

But even on a personal level, we are going to have to make those sorts of decisions. If I take my children to the park — even if I am incredibly careful — I am slightly increasing the risk that they or I will pass on the disease. There is a small but non-zero chance that someone will die because of that decision. But if I don’t take them to the park, their lives will very probably be slightly worse. How do you trade low-probability but high-impact events off against high-probability, low-impact ones like that? How many days at the park is worth one death? The answer is not “infinity”; we don’t think that lives are infinitely valuable. We take small risks of terrible things happening all the time, in order for nice things to very likely happen: we cross the road to the restaurant, we drive to the seaside.

Now, though, those decisions are being made starkly obvious. And the awful thing is that sometimes taking the risk is rational, on the utilitarian calculus. If your children gain +10 Happiness Points from going to the park, someone dying is worth -100,000 Happiness Points, and the risk of someone dying from your specific actions is 0.005%, then since 0.005% of -100,000 is -5, the expected cost is smaller than the +10 gain and the park visit is worth it. How do you factor that in, when you’re already struggling to overcome the Molochian trap of the coordination problem?
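Spelled out as code, that back-of-the-envelope sum looks like this (the Happiness Point figures are, of course, the made-up ones above):

```python
# The made-up numbers above, plugged into a simple expected-value sum.
happiness_from_park = 10          # +10 Happiness Points for the children
cost_of_a_death     = -100_000    # -100,000 Happiness Points
risk_of_death       = 0.00005     # 0.005% chance the trip causes a death

expected_value = happiness_from_park + risk_of_death * cost_of_a_death
print(expected_value)  # 10 + (-5) = 5: positive, so the trip is "worth it"
```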

I should admit: at the weekend, we took our children to a local park, moved as far away from others as we could, and played football and frisbee just between the four of us. Around us – at a distance – we could see dozens of others, all slowly moving around in their families or other groups. It looked like everyone was horribly aware of the utilitarian trade-off they were making, even if they weren’t thinking in actual numbers. 

Millions of people around the country have been finding themselves in the middle of a massive multiplayer game of something like the coordination problems described above, played for real-world stakes of lives and happiness. People are having to try to work out what the right thing to do is, and sometimes having to do the right thing even though it is technically irrational — it will have worse outcomes for that person whatever everyone else does. I don’t blame them for not solving problems on the fly that people have spent entire PhDs on.

Coordination problems are not impossible to solve for the actors involved. The Cold War ended; progress has been made on carbon emissions. But it’s hard, and it takes time we don’t have. The Government’s lockdown represents the equivalent of the Farming Czar coming in and ordering everyone to cooperate. We’re not stuck in traffic, we are traffic, and right now we need the roads to be clear.


Tom Chivers is a science writer. His second book, How to Read Numbers, is out now.
