May 5, 2023 - 3:00pm

If you walk past a mobile camera van in central London this weekend, your face may be one of millions scanned, measured, turned into a matrix of numbers and compared against a police database of similarly encoded faces. If yours closely resembles a face on the “Wanted” list, you could be stopped by a human being and asked to prove that you’re not the person being sought for arrest. That’s because the Metropolitan Police have announced that they intend to use Live Facial Recognition (LFR) technology to watch the Coronation crowds.

Back in 2020, I sidled up to a bearded man in a café in San Francisco. “Excuse me,” I said, “aren’t you Supervisor Aaron Peskin? I’ve been emailing your office to ask for an interview.” I was writing about how San Francisco became the first city in the world to ban Facial Recognition Technology. A handful of other US cities quickly followed San Francisco’s lead in banning (or severely restricting) police use of FRT.

I didn’t admit that, to confirm my suspicion that the man sitting across the café was the same man responsible for that legislation, I had done a furtive internet search on my phone and compared his face with identified photographs in the public realm.

Live Facial Recognition technology automates a process similar to the one I used to track down Peskin: comparing the face in front of the camera to a verified photograph, and then checking directly with the individual on the spot. Met Police cameras will not record images, or attempt to identify every passer-by. Instead, the faces of people walking past will be compared against a watchlist. Plausible matches will alert a human operator, who must judge whether to engage with the matched person.
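The matching step described above can be sketched in a few lines of code. This is a minimal illustration, not the Met’s actual system: the toy three-number “embeddings”, the `screen_face` function and the 0.8 similarity threshold are all my own assumptions, standing in for the hundreds of dimensions and tuned thresholds a real face recognition pipeline would use.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face embeddings, from -1 to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def screen_face(embedding, watchlist, threshold=0.8):
    """Return the best watchlist match above threshold, or None.

    A match does not trigger any action by itself; in an LFR
    deployment it only alerts a human operator, who decides
    whether to approach the person.
    """
    best_name, best_score = None, threshold
    for name, listed in watchlist.items():
        score = cosine_similarity(embedding, listed)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 3-number embeddings for illustration only.
watchlist = {"suspect_A": [0.9, 0.1, 0.4]}
passerby = [0.88, 0.12, 0.41]   # closely resembles suspect_A
stranger = [0.10, 0.90, 0.20]   # resembles no one on the list

print(screen_face(passerby, watchlist))  # "suspect_A": flagged for review
print(screen_face(stranger, watchlist))  # None: not flagged, not recorded
```

Note the key design point the article relies on: faces below the threshold produce no match and, in the Met’s stated design, no stored record; only plausible matches surface to a human.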

The Met now requires authorisation by a senior officer, detailing the legitimate aim, legal basis, necessity, proportionality, and impact assessments, before LFR is used. It recognises the importance of transparency to retain public trust, and also the potential deterrent effect of highly visible deployment on individuals who suspect they may be on a watchlist. This is certainly an improvement on the early, unregulated deployment I wrote about in 2020. But it is still a step towards a world of ubiquitous surveillance, and away from freedom of movement and association.

Alun Michael, South Wales Police and Crime Commissioner, defended LFR to me in a 2021 radio programme. How was it different, he asked, from posting a policeman above a crowded station to watch out for known suspects? Apart from being more efficient, of course. There are pragmatic objections to the limitations of the technology. In particular, it’s poor at distinguishing non-white faces, meaning that dark-skinned people, who already feel unfairly scrutinised by the police, are more likely to be wrongly flagged. But this is not the main reason to be concerned about routine use of LFR.

It’s good that LFR doesn’t keep a record of all the innocent faces passing by, but it still checks each one against a list. It is, in that sense, equivalent to asking to see every person’s ID to check that they’re not wanted for arrest or trial. Even if no record were kept of all those ID cards, we would still feel scrutinised. We’d be justified in feeling unable to move freely around a city without proving that we were not a suspicious person. 

Like fingerprints or DNA, our faces are unique enough to declare who we are, with or without our consent. We will never be a “papers please” society now, because papers will soon be unnecessary to identify us anywhere we go, in public or online. If we want privacy, we will have to make the moral, legal and political case to defend it.

Timandra Harkness presents the BBC Radio 4 series, FutureProofing and How To Disagree. Her book, Big Data: Does Size Matter? is published by Bloomsbury Sigma.