The High Court will hear a landmark legal challenge later this month over the Metropolitan Police’s use of live facial recognition technology, in a case that could have far-reaching implications for surveillance practices across England and Wales.
The judicial review, brought by anti-knife crime campaigner Shaun Thompson and Big Brother Watch director Silkie Carlo, will be heard on 27 and 28 January 2026. It comes just weeks after the Government pledged to “ramp up” the use of live facial recognition by police forces nationwide.
The claimants argue that the Metropolitan Police’s deployment of the AI-driven biometric technology breaches fundamental human rights, including the right to privacy and freedoms of expression and assembly under the European Convention on Human Rights.
Live facial recognition allows police to scan the faces of passers-by in real time and compare them against watchlists compiled by law enforcement. While the technology has been used by the Metropolitan Police since trials at the Notting Hill Carnival in 2016 and 2017, its use has expanded rapidly in recent years. Police deployed the cameras 180 times in 2024 and 231 times in 2025, scanning an estimated 4.2 million faces in public spaces across London last year alone.
The force has also begun installing permanent facial recognition cameras, including in a shopping area in Croydon, while Hammersmith and Fulham Council has committed to upgrading parts of its CCTV network to include the technology.
Big Brother Watch, which is supporting the legal action, argues that the police policy governing where live facial recognition can be used is so broad that it fails to provide meaningful legal limits. Under current guidance, deployments are permitted in “crime hotspots”, access routes to those areas, public events and locations identified through police intelligence. The claimants say expert evidence shows that the definition of a crime hotspot is so expansive that most public spaces in London fall within it.
Ms Carlo warned that the growing use of the technology risks turning the capital into “a panopticon”, treating the public as “suspects in a permanent police line-up”. She said the possibility of being subjected to a digital identity check “almost anywhere, at any time” represented a serious infringement of civil liberties and had a chilling effect on protest and public expression.
The case is also the first in Europe to be brought by an individual who was misidentified by police facial recognition. Mr Thompson, 39, was wrongly flagged as a criminal while travelling through London Bridge and detained by officers for more than 20 minutes despite providing identification. He described the experience as “stop and search on steroids”.
Unlike in the European Union, where the AI Act generally prohibits live facial recognition except in exceptional circumstances, police forces in England and Wales have used the technology since 2016 without any specific primary legislation.
The claimants are represented by barristers Dan Squires KC, Aidan Wills and Rosalind Comyn of Matrix Chambers, alongside solicitors from Bindmans LLP. The High Court’s decision is expected to be closely watched by police forces, lawmakers and civil liberties groups alike.