January 14, 2026 - 3:40pm

Home Secretary Shabana Mahmood this afternoon stated that she no longer has confidence in the head of West Midlands Police, after it was revealed that the force used AI in compiling its risk assessment for its handling of a football match between Aston Villa and Maccabi Tel Aviv in November.

Chief Constable Craig Guildford admitted earlier today that AI invented a fictitious game between the Israeli club and West Ham. This hardly helped his case that the decision to ban Maccabi fans in Birmingham was not a foregone conclusion, reached in the face of resistance from Muslim lobby groups. It does, however, strengthen allegations that police fabricated their intelligence case. Another question springs to mind, too: why did Britain’s third-largest police force rely on AI in such a contentious piece of decision-making?

As an acolyte of the College of Policing, Guildford would be familiar with the acronym THOR — or Threat, Harm, Opportunity and Risk — which is part of policing’s ponderous “Authorised Professional Practice” for risk management. Clearly, he failed to apply this principle to his force’s reliance on artificial intelligence. The incident highlights several issues, not least the use of AI-based technologies in law enforcement and the degrading of police intelligence units. Then there’s the biggest issue of all: the poor quality of British policing’s senior leaders. Guildford cut an unimpressive figure at the Home Affairs Select Committee today, his explanations for his force’s actions failing to convince the assembled MPs.

Policing relies on vast quantities of data, of the sort LLMs excel at combing. With the right prompts, the technology could revolutionise fast-time incident response, as well as information-gathering for planned operations. It’s no exaggeration to say AI could save lives. However, this relies on those using such technology understanding and appreciating the importance of what intelligence professionals call “provenance”. That is, where precisely did a given piece of information come from? Whoever used Copilot to conjure data concerning a football match that never happened foolishly took AI at face value.

Guildford might also consider how the force he leads structured its intelligence units. Policing began professionalising intelligence as a discipline in the mid-2000s, resulting in the cumbersome National Intelligence Model. Combined with other legal requirements concerning proportionality, human rights and risk, NIM often proves an unwieldy and prescriptive way of receiving, interpreting and issuing information. Meanwhile, intelligence units — at the behest of performance-obsessed senior officers and non-operational support staff — came to be seen as useful stat-crunchers. It wasn’t unusual to see analysts producing specious performance tables rather than identifying suspects. Then, as austerity kicked in, police intelligence units were streamlined — or, in other words, ruthlessly carved up. In London, for example, 30-strong intelligence units were reduced to three officers.

What’s more, senior officers are increasingly promoted for their compliance with diktat and procedure, rather than their leadership potential. Home Office processes are specifically designed to ensure that only those committed to groupthink enjoy access to the highest ranks. The result is a cohort of proceduralists and mediocrities, further removed than ever from operational detail.

Guildford should have had an acute understanding of risk assessment policies and, crucially, how such decision-making should be properly logged. The investigation by Andy Cooke, the Chief Inspector of Constabulary, will no doubt highlight any weaknesses within Guildford’s command team. Sadly, the evidence so far suggests a slapdash chief officer appeasing angry local politicians, then lazily and arrogantly using “intelligence” to justify his decision. The role of AI, and a digitally imagined football match, is a fascinating footnote to a story of very human failures. Moving forward, it would be unfortunate if this incident prevented others from exploiting such a potentially transformative technology.


Dominic Adler is a writer and former detective in the Metropolitan Police. He worked in counterterrorism, anticorruption and criminal intelligence, and now discusses policing on his Substack.