High Court Dismisses Legal Challenge Against Police Facial Recognition
Two individuals who launched a judicial review against the Metropolitan Police's deployment of live facial recognition cameras have seen their legal challenge dismissed by the High Court. The case, which highlighted growing concerns about mass surveillance in public spaces, represents a significant setback for privacy advocates seeking to curb the technology's expansion across London.
Wrongful Identification Sparks Legal Action
Shaun Thompson initiated the legal proceedings after the police's facial recognition system incorrectly identified him as a suspect outside London Bridge Tube station in February 2024. Thompson joined forces with Silkie Carlo, director of the privacy organization Big Brother Watch, to contest what they described as "disturbing" mass biometric surveillance of ordinary citizens.
During last month's High Court hearing, lawyers representing the pair revealed that police use of live facial recognition technology has been increasing "exponentially." According to Dan Squires KC, who presented evidence during the judicial review, the Metropolitan Police deployed facial recognition systems 231 times last year alone, scanning approximately 4 million faces across the capital.
Expansion Plans and Current Deployments
The Metropolitan Police have confirmed plans to extend the camera system's deployment, though public consultation on the expansion remains ongoing. Earlier this year, authorities activated the technology at London Bridge station, one of the United Kingdom's busiest transport hubs with over 54 million passengers annually.
The artificial intelligence-powered system continuously scans faces within camera range, comparing them against a watchlist of serious criminals. When the system detects a potential match, it alerts a police officer who manually reviews the identification and conducts additional verification before determining whether the individual represents an actual suspect.
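The match-and-verify workflow described above can be sketched in minimal form. Everything in this sketch is an illustrative assumption rather than a detail of the Met's actual system: the cosine-similarity comparison, the 0.64 threshold, and all names (`WatchlistEntry`, `screen_face`) are hypothetical, chosen only to show how an automated alert still leaves the final determination to a human reviewer.

```python
# Illustrative sketch of a facial-recognition screening step.
# The similarity metric, threshold, and names are assumptions for
# illustration, NOT details of any real deployed police system.
from dataclasses import dataclass


@dataclass
class WatchlistEntry:
    name: str
    embedding: list[float]  # face template vector for a wanted person


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Compare two face-embedding vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def screen_face(live_embedding: list[float],
                watchlist: list[WatchlistEntry],
                threshold: float = 0.64) -> list[tuple[str, float]]:
    """Return candidate matches scoring above the threshold.

    An alert is only a *potential* match: per the workflow in the article,
    an officer must manually review it before anyone is treated as a suspect.
    """
    alerts = []
    for entry in watchlist:
        score = cosine_similarity(live_embedding, entry.embedding)
        if score >= threshold:
            alerts.append((entry.name, round(score, 3)))
    # No alert: in the described design, the scanned face is simply discarded.
    return alerts
```

The key design point the sketch illustrates is that the system only surfaces candidates; the threshold trades false alarms against missed matches, which is precisely why a human verification step sits between the alert and any police action.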
Privacy Concerns and Lack of Oversight
Privacy and civil liberties campaigners have issued strong warnings about the technology's rapid deployment, noting that police use of facial recognition across the country currently operates without comprehensive monitoring or specific legislation governing its application.
Madeleine Stone, a senior advocacy officer at Big Brother Watch, emphasized that existing laws have failed to keep pace with surveillance technology developments. "The police have essentially been left off the leash and can do what they want with this," Stone stated. "Everyone gets something wrong sometimes, but what happens when the algorithm gets it wrong? Who is responsible then?"
Understanding Live Facial Recognition Technology
The Metropolitan Police describe Live Facial Recognition (LFR) as a tool for preventing and detecting crime and locating wanted criminals. Camera footage is streamed directly to the LFR system, where faces are compared against watchlists. Authorities also note the technology can help establish a person's identity when they cannot communicate who they are.
Typically deployed at major events or in crowded areas, often mounted on police vans, LFR technology first appeared in England and Wales during the 2017 UEFA Champions League final in Cardiff. More recently, facial recognition software was installed throughout Cardiff for Six Nations rugby games, though the system scanned 162,680 faces without resulting in any arrests.
The High Court's decision comes amid ongoing debates about balancing public safety with individual privacy rights, particularly as surveillance technology becomes increasingly sophisticated and widespread across urban environments.