March 20, 2019

Whoops! It was with the bland words “we apologise for the inconvenience” that the once mighty social network MySpace announced that in the course of a botched “server migration” they accidentally lost every bit of content uploaded to the site before 2016. That’s 50 million songs by 14 million artists (including, news reports solemnly warn us, early work by the “MySpace Generation” of artists such as Lily Allen and Arctic Monkeys) as well as countless videos and photographs. All gone, gone utterly. Into the void.

Let’s say we accept that this really was simply a giant bit of incompetence rather than, as some tech people have speculated, a good way not to have to host 50 million unlistenable MP3s and toe-curling mid-2000s selfies in perpetuity. It’s disconcerting, eh? Didn’t we once believe that the internet was the place from which nothing ever disappeared? Didn’t we, indeed, once hope that?

Our default position as a species is to be in favour of memory. At the personal level we keep letters and diaries and photograph albums. At the state level we list buildings and erect monuments, fill museums and libraries and archives. We intone George Santayana’s dictum that those who cannot remember the past are condemned to repeat it. We recognise that scientific and medical knowledge are a dialogue with the past: we modify or build on it, or we overturn it. No Newton, no Einstein; no Einstein, no Hawking. The Dark Ages were so called (unfairly or not) because they were a period of forgetfulness. The notion floated in the 12th century that we are dwarves standing on the shoulders of giants still resonates.

Perhaps this is slightly heavy artillery to bring to bear on the disappearance into the digital never-never of, like, a gazillion bedroom jam sessions and early takes of a few Kate Nash B-sides. We’re not, you might think, in the same territory as the Taliban bashing up the Bamiyan Buddhas or Isis bulldozing Nimrud. We’re not looking at Venice sinking into the lagoon with a full cargo of Tintorettos. But there’s a principle here.

One of the basic enablers of human civilisation is what’s called “transactive memory”: that is, the outsourcing of personal memory to the collective, to something outside ourselves. That might be family members helping remind each other about what happened on holiday that year. It might be folk memory in the oral tradition. And later on it was the written word – and in due course the electronic ether. We share the cognitive burden and achieve more as a group than we can as individuals. Our tribal knowledge is held in common.

The problem with transactive memory, though, is that it tends to weaken individual recall – as has been recognised everywhere from Socrates’s suspicion of the written word to my consternation when Google Maps goes on the blink anywhere more than a three-minute walk from my front door. And we’ve been very blithe about the internet as the ultimate remembering machine. We tend to gloss over the way file formats become redundant, companies go bust and hardware fails. We tend not to think too hard about those old websites that now return 404 errors, or where the formatting is borked and there are red crosses or error messages where there used to be photographs.

And even to the extent that it does work there are two problems, I think, with the Funes the Memorious version of the internet as a space where no human object is too trivial to be forgotten. One is the personal, practical one. Just as the law recognises the notion of spent convictions – that a crime committed and atoned for in youth need not follow the criminal for the rest of his or her days – and just as the DVLA will one day knock those points off my driving licence, there’s a rightful sense that people should not be personally and permanently accountable for everything they’ve ever said or thought or done. “Offence archaeology” – where unsavoury long-gone social media posts are spread virally in the hope of “cancelling” the reputation of one’s enemies – is an object lesson in why.

If every teenage indiscretion, every embarrassing photograph, every bad-taste joke, is ineradicable, one is the permanent captive of one’s younger self, as the US presidential hopeful Beto O’Rourke has recently discovered. The child really is father to the man: and not the sort of nice father who takes you to the pictures on Saturday and bounces you on his shoulders in the swimming pool, but the sort who locks you in the cellar and batters you for speaking out of turn. To err is human and to forgive divine: but in the absence of divinity forgetting will have to do. And, indeed, the “right to be forgotten” is now enshrined in Article 17 of the GDPR.

We might add that there are good commercial and moral reasons that big data should have a limit on how much information it can hold on us and for how long. Though we should also note the irony that we zoom between being outraged at the amount of information social media companies hold on us, and stricken at the idea that they might lose our precious data.

But the wider, non-privacy-related issue – the cultural rather than the personal issue – is the vast noise-to-signal ratio. Honouring the past doesn’t mean archiving everything. Museums select. Heritage organisations select: we list and protect only, in theory, the good stuff. Libraries, too, select – even copyright libraries are archiving only those books, out of the millions written and submitted, that have found favour with publishing houses. But as storage has become cheaper and cheaper, this principle has started to be forgotten. Since cameras arrived on mobile phones, for instance, my family photograph album has grown by a few dozen photographs a week. Already its sheer size renders it all but useless.

So, much as we might deplore the loss of our MP3s into MySpace’s forgetting hole, it’s a helpful wake-up call. It asks us to stop assuming that everything we might want is automatically being backed up and will be available for us somewhere, and tells us that we can’t simply bung everything with complete confidence into an infinite digital attic. It asks us (both individually and at corporate level) to stop merely archiving and start curating: to select now what we want to keep and make sure that we have it in formats that we can be sure will survive.

And, at the deep level, it reminds us that the second law of thermodynamics is going to get us whether we like it or not. Everything that we rescue from time is temporary. Shakespeare’s verse may outlast that marble and those gilded tombs, but not – in the cosmic scheme of things – by all that long. All our pomp of yesterday is one with Nineveh and Tyre.

And one day, like the Anglo-Saxon poets who thought Roman ruins the work of giants, we too will gaze in melancholy wonder on the broken columns of the early digital age. “Wrætlic is þes wealstan, wyrde gebræcon; / burgstede burston, brosnað enta geweorc.” (“This masonry is wondrous; fates broke it; courtyard pavements were smashed; the work of giants is decaying.”) These words from the ninth century, I note without comment, have already outlasted the early works of the Arctic Monkeys. Ubi sunt.


Sam Leith is literary editor of The Spectator. His forthcoming book, The Haunted Wood: A History of Childhood Reading, is out in September.