Have iPhones really destroyed a generation?

It’s hard to accurately measure the effects of screen time on children. (Credit Image: Sean Gallup/Getty Images)

Every generation worries about new technologies; in Plato’s Phaedrus, Socrates mentions that some people are worried that learning to read and write “will create forgetfulness in the learners’ souls, because they will not use their memories”.

At the moment, one of the big scares is “screen time”, which is to say how much time we, and especially our offspring, are spending in front of screens of various kinds. In Britain, the entire phalanx of the media, from TechAdviser to the Daily Mail to Esquire, has published articles on the risks; in America, The Atlantic’s “Have iPhones Destroyed A Generation?”, a doom-laden piece about the social, psychological and health impacts of smartphones taken from Jean Twenge’s book iGen, was the most widely shared science article of 2017. Screen-fear sells, worldwide.

Should we be worried? Well, it’s complicated. It’s actually incredibly hard to study and report on the impacts of screen time. Here’s what I discovered.

Don't panic!

Children who use smartphones for up to two hours a day show higher levels of wellbeing than those who don’t. (Credit Image: Getty Images)

Last year, scientists at Oxford University carried out an intensive, well-designed study of the screen habits of 120,000 teenagers, to assess whether the amount of time they spent on screens was affecting their mental wellbeing.

The short answer was: yes, but not how you’d expect.

Children who used smartphones for up to two hours a day, or video games for up to about six, actually showed slightly higher levels of wellbeing than those who used them less, or not at all. There was a dip for those who used them more, but only a slight one. At the time, one of the authors told me that even very extreme use had a much smaller effect on your state of mind than missing breakfast. It doesn’t mean there’s nothing to worry about, but nor is it indicative of a societal plague.

Concerns about “video game addiction” and “social media addiction” are probably overblown

Don’t believe anyone who talks confidently about “video game addiction”. (Photo by Spencer Platt/Getty Images)

The same Oxford University team carried out another study, last year, looking at video game addiction. It took 5,000 people in the US, measured whether they met the criteria for addiction, then checked again six months later. There are nine criteria, such as “risked friends or opportunities due to games” or “I felt moody or anxious when unable to play”; if you meet five of them, you’re considered addicted, according to the American Psychiatric Association.

Of the 5,000 people, some did meet the criteria at the start, and some met them at the end, but not a single person met them both at the beginning AND the end. That implies that as “addictions” go, it’s a relatively easy one to break. Compare that with, say, smoking: if someone smokes now, it’s a pretty good bet they’ll smoke in six months’ time.

And “social media addiction” is an even slipperier concept. It has not been adequately defined, and there’s an awful lot of confusion between colloquial, everyday meanings of “addicted” (“I’m so addicted to wine gums!”) and the clinical definition.

I spoke to Amy Orben, a psychologist at Oxford University who is doing her PhD on the effects of social media, and she was unwilling to even say whether there was evidence “for” or “against” it existing. “You can’t really say whether something exists when you can’t even say what it is,” she said.

Again: none of this means that there’s nothing to worry about. But it does mean that people who talk confidently about the dangers of video game addiction, or social media addiction, are doing so without any good evidence for their claims.

Unfortunately, there's not much good research out there

Does going to see Hamilton make you live longer? The evidence is hard to tease out. (Photo by Spencer Platt/Getty Images)

The debate has centred on “screen time”. That’s a hangover from the 1970s, when screens meant TV. But now the things you can do with a screen are many and various. If all you know is that someone’s looking at a screen, they could be reading Doris Lessing on their phone’s Kindle app, playing Space Marine on the Xbox, or watching hentai porn on their laptop.

Do we expect these experiences to be so similar that we can lump them all under the same research heading? The Oxford University study above tried to separate out the different kinds of screen time, but not much research has done that.

And more broadly, it’s hard to study things when you can’t do proper randomised controlled trials. Say I want to find out whether watching the musical Hamilton makes you live longer. I look at all the people who watch Hamilton, I follow them for the next 60 years, and I compare them with people who don’t watch Hamilton, and lo and behold, I find that people who watched Hamilton live, on average, 10 years longer than the rest of the population.

So, Hamilton prolongs your life, is that right? Buy your tickets now! Except, of course, that people who can afford £100 for a ticket in the circle are much more likely to be well-off, and we know that rich people usually live longer. If you controlled for social class and education, you’d find the correlation disappeared.

This is true of screen time. Say I find that people who play lots of violent video games are more violent. Is that because violent video games make you violent? Maybe. But maybe violent people are more likely to enjoy violent video games. The evidence is extremely hard to tease out. So anyone who confidently tells you what screen time, or video games, or social media does to your health is making it up.

Don't forget: scaremongering sells.

In 2013, the Daily Mirror ran a front-page story blaming video games for cancer in children.

Science reporting in the mainstream media is tricky. The careful, caveated, on-the-one-hand-on-the-other stuff takes a long time to research and write, and is often not as widely read as “mobile phones cause cancer”.  

So the incentives to publish the latter, which can be produced quickly and shared all over the place, are strong – especially when they confirm a reader’s technophobia. That’s why pieces such as The Atlantic’s – which almost every scientist I’ve spoken to agrees was overhyped, won’t-somebody-think-of-the-children panic-mongering – are so widely shared, and why several newspapers ran headlines in 2015 claiming that Call of Duty causes Alzheimer’s, off the back of a study which did not in fact mention Alzheimer’s at all.

There's no digital-world bad, real-world good divide

Too much of anything isn’t great for you. Even too much C. S. Lewis. (Credit Image: Danny Lawson/PA Archive/PA Images)

Not exercising and not sleeping enough are bad for you. That has nothing to do with screen time. If your daughter sits in a chair for 20 hours a day reading the whole Narnia canon, she will become less healthy.

So if phone use is getting in the way of exercising or sleeping, that should be corrected. But the same is true of literally any activity, and there’s no reason to think screens are uniquely worrying. There isn’t a “digital stuff is bad”, “healthy real-world stuff is good” divide.

Fundamentally, it’s possible to do anything too much, whether it’s exercise, or book-reading, or Dark Souls III. So if you think your child is spending too much time in front of a screen, there’s only one person who is going to tell them otherwise: you.