Want to make sense of our chaotic world? James Gleick’s The Information will change the way you consider the cosmos

What do smartphones have in common with the cosmos? Credit: LUDOVIC MARIN/AFP via Getty Images


December 24, 2019   5 mins

The 64GB phone in your pocket contains 512 billion bits. Bits of what? Bits of information: every single one a little digital flip-flop. It’s a one or it’s a zero. And yet get enough of those ones and zeros together in one place (with the help of some cunning technology) and you have a device that might as well be The Hitchhiker’s Guide to the Galaxy.
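For the arithmetic-minded, here is a quick sanity check (a sketch of mine, not the author's, assuming the decimal gigabytes that storage vendors use):

```python
# How many bits in a 64GB phone?
# (Assumes decimal gigabytes, as storage vendors count them.)
BITS_PER_BYTE = 8
GIGABYTE = 10**9  # bytes

bits = 64 * GIGABYTE * BITS_PER_BYTE
print(f"{bits:,} bits")  # 512,000,000,000: half a trillion little flip-flops
```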

Speaking of the galaxy, did you know that we are able to make a pretty reliable estimate of how much information there is in the entire universe? “10⁹⁰ bits”, was the estimate of John Archibald Wheeler, then the last surviving collaborator of Einstein and Bohr. That’s a lot, obviously — but it’s a number you can write down. Mind securely blown.

What unites these two things — the phone in your pocket and the cosmos of which it’s an infinitesimal part — is information. And the question of how much information there is in something was effectively impossible to ask until relatively recently. If you’d asked a medieval librarian how much information there was in his library, or — for that matter — the head librarian at the British Library in 1947, they wouldn’t have understood the question. How can an amount of information be measured?

That’s where my book of the decade comes in. We live in the “information age”, but until I read James Gleick’s The Information: A History, a Theory, a Flood (2011), I had little or no idea what that actually meant. It’s the story of how we learnt to measure information, and it begins with a man called Claude Shannon, a mathematician who worked for Bell Labs in the 1940s. 

Bell was a telephone company, and a big one: “By 1948 more than 125 million conversations passed daily through the Bell System’s 138 million miles of cable and 31 million telephone sets.” Like Google, the similarly vast tech monopoly of a later era, it didn’t mind giving its research people the chance to tinker around. With its “vast, monopolistic” mission, writes Gleick, “AT&T at midcentury did not demand instant gratification from its research division.” 

Anyway, what Shannon published in 1948 was a paper in an in-house magazine (The Bell System Technical Journal) called “A Mathematical Theory of Communication”. He was seeking to answer, in a systematic way, the question of what it actually was that all those miles of telephone cable were carrying. Conversations? Words? Electricity, obviously. But how do you measure that stuff? Reading the volts and amps won’t tell you much about what Mavis said to Dolly when they had a natter this morning. 

Shannon proposed the idea of a thing called a “bit”, which he described as “A unit for measuring information”. And so “Information Theory” was born. A bit is a binary digit: something is either there, or it’s not. It has a value of 0 or a value of 1. This is a tiny unit, and a quite dizzyingly big idea.

When it was made simple — distilled, counted in bits — information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. 
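To make that bridge concrete, here is a minimal sketch (mine, not Gleick's) of Shannon's formula H = −Σ p·log₂ p, which measures a message's average information in bits per symbol. The more predictable the message, the fewer bits it carries:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per character, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Predictability is the opposite of information.
print(shannon_entropy("aaaaaaaa"))  # 0.0 -- total certainty, zero information
print(shannon_entropy("abababab"))  # 1.0 -- one bit per character
print(shannon_entropy("a&x!q0zm"))  # 3.0 -- eight distinct symbols, three bits each
```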

A bit is the atomic unit of information, in other words, and once you start to see the little buggers, it turns out they are all over the shop. Economies are systems of information. So are people. DNA, for instance, is replicable, measurable information that comes with all the other properties of information: noise and signal, deterioration in transmission, and so on. Evolution “embodies an ongoing exchange of information between organism and environment”. Matter itself can be understood in informational terms — as Gleick puts it rather poetically,  

“When photons and electrons and other particles interact, what are they really doing? Exchanging bits, transmitting quantum states, processing information. The laws of physics are the algorithms. Every burning star, every silent nebula, every particle leaving its ghostly trace in a cloud chamber is an information processor. The universe computes its own destiny.”

If, as has been argued, information is the stuff that underpins the very fabric of the universe, then this really is a theory of everything.

The joys of Gleick’s book are his ability to explain some pretty complex mathematical issues in a way that non-mathematicians will be able to understand; the fantastically agile connections he makes; and the engrossing examples he uses. Where he writes about entropy and chaos, information is an ice-cube in the gin-and-tonic of randomness, a temporary state of organisation.

After introducing Shannon, for instance (whom we meet, en passant, having lunch with Alan Turing), Gleick zips back to look at African talking drums. The funny thing about these drums — which could communicate complex messages over large distances, and were understood by everybody — is that they seemed to send much more information than was necessary. If a drummer wanted to say “come home”, the message would be: “Make your feet come back the way they went/ make your legs come back the way they went,/ plant your feet and your legs below,/ in the village which belongs to us.” 

This looks like inefficiency — but it’s the opposite. The drums replicate a tonal language, and a common two-tone phrase on the drums could be any word with the same two tones — but glossed with recognisable standard formulas, like Homeric epithets, a word’s meaning becomes clear. The drummers were — not that they’d have put it that way — recognising the necessity to “allocate extra bits for disambiguation and error correction”. Redundancy helps overcome ambiguity, which is why army signallers say “Bravo Victor Tango” rather than “BVT”: the extra syllables prevent mishearing. 
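To see how spending extra bits buys back certainty, here is a toy illustration of my own (not from the book) of the simplest error-correcting scheme there is: a triple-repetition code, in which each bit is sent three times and the receiver takes a majority vote.

```python
def encode(bits: str) -> str:
    """Repeat each bit three times: '10' becomes '111000'."""
    return "".join(b * 3 for b in bits)

def decode(received: str) -> str:
    """Majority vote on each triple survives one flipped bit per triple."""
    triples = (received[i:i + 3] for i in range(0, len(received), 3))
    return "".join("1" if t.count("1") >= 2 else "0" for t in triples)

sent = encode("1011")      # '111000111111'
garbled = "110000111111"   # noise flips the second bit in transit
print(decode(garbled))     # '1011' -- the redundancy overcame the error
```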

A mathematical understanding of communication starts to make it possible to talk about how that works – and to see how information isn’t just about saying stuff, but about creating a structure out of a chaotic field of possibilities. Information, as one writer put it, is “negative entropy”. Gleick gallops on through Morse code, the multilayered history of written language, the ideas behind logarithms and computation, telegraphy, cryptography, Russell and Gödel, towards the compression of digital signals that makes it possible, among other things, for me to watch Wonder Woman on Netflix.  
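That compression, in turn, is just redundancy-hunting in reverse. A quick experiment (mine, using Python's standard zlib module) shows a general-purpose compressor squeezing patterned, low-entropy data to almost nothing while barely denting random bytes:

```python
import os
import zlib

redundant = b"come home " * 1_000   # highly patterned: low entropy
noise = os.urandom(10_000)          # unpredictable: maximum entropy

print(len(redundant), "->", len(zlib.compress(redundant)))  # 10000 -> a few dozen bytes
print(len(noise), "->", len(zlib.compress(noise)))          # 10000 -> ~10000: incompressible
```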

The genius here is that Gleick gives you all the technological history, from cuneiform to cybernetics, but at the same time gives you the theoretical underpinnings — the basic ideas in probabilities and logic that make possible everything we do. That make it possible, in fact, for me to transmit this article to UnHerd, and for you to make sense of it. Make it possible for me to be here in the first place, and you likewise, come to that. 

“I think that this present century,” Shannon said in a speech at the University of Pennsylvania, “in a sense will see a great upsurge and development of this whole information business; the business of collecting information and the business of transmitting it from one point to another, and perhaps most important of all, the business of processing it.” He was certainly onto something there. 

And that’s why this book really is essential for making sense of the way we live now. It describes a revolution in the way we understand the world arguably as far-reaching and tentacular as the development of the written word; and, indeed, it encompasses that.

But in his closing chapters — which cover everything from the growth of Wikipedia to David Foster Wallace’s grim warnings of “Total Noise” — Gleick addresses what it means to be drowning in the present day’s flood of information. How do we sift meaning from all this “information overload”?  

I was on the panel for the Samuel Johnson Prize for nonfiction (as it then was) the year Gleick’s book was published, and it remains a source of regret to me that I wasn’t able to get his book onto the shortlist. It’s stayed with me longer than many of the books that got further in the competition. It helps you not only to make sense of the digital world (or neurobiology, or macroeconomics, or astrophysics, or linguistics), but also to glimpse the very foundations on which it’s built. That’s got to be worth — so to speak — a bit of your time. 


Sam Leith is literary editor of The Spectator. His forthcoming book, The Haunted Wood: A History of Childhood Reading, is out in September.