Live Facial Recognition Cameras Deployed at London Bridge Station in Police Trial

New live facial recognition cameras have been installed at London Bridge station, one of the United Kingdom's busiest transport hubs, as part of a trial conducted by the British Transport Police. The technology, which uses artificial intelligence, will scan the faces of millions of passengers and compare them against a database of individuals wanted for serious criminal offenses.

How the Surveillance System Operates

The cameras continuously record and analyze faces within the station, which accommodated over 54 million passengers last year. When the system identifies a potential match with someone on the watchlist, it generates an alert for review by a police officer. The officer then manually assesses the alert and conducts additional verification to determine whether the individual is in fact a suspect requiring further action.
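The match-alert-review flow described above can be sketched in simplified form. This is an illustrative sketch only, not the actual system: the threshold value, the `screen_face` function, and the use of cosine similarity over face embeddings are all assumptions chosen to show the general pattern of automated matching followed by mandatory human review.

```python
import math

# Hypothetical confidence cutoff; real systems tune this to balance
# false alerts against missed matches.
MATCH_THRESHOLD = 0.80

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (assumed non-zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def screen_face(embedding, watchlist):
    """Compare one captured face embedding against the watchlist.

    Returns an alert dict flagged for manual officer review, or None,
    in which case the captured image is discarded (mirroring the stated
    policy of immediately deleting non-matching images).
    """
    best_id, best_score = None, 0.0
    for person_id, reference in watchlist.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        # The system only raises an alert; a human officer must verify it.
        return {"suspect_id": best_id,
                "score": round(best_score, 3),
                "action": "refer to officer for manual verification"}
    return None  # no match: image deleted immediately
```

The key design point the sketch captures is that the automated comparison never triggers action by itself; any candidate match is handed to an officer for verification, and non-matches produce no stored record.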

Chief Superintendent Chris Casey of the British Transport Police emphasized that this trial aims to evaluate the technology's performance in a railway environment. "We are committed to using innovative technology to make the railways a hostile place for individuals wanted for serious criminal offenses, helping us keep the public safe," Casey stated.

Privacy Concerns and Alternative Routes

During the trial period, passengers who wish to avoid being scanned by the facial recognition cameras will have access to alternative routes through the station. The police have assured that images of individuals not on the authorized database will be deleted immediately and permanently. QR codes displayed on posters throughout the station allow passengers to provide feedback on the trial.

However, privacy advocates have expressed significant concerns about the deployment of this technology. Matthew Feeney of Big Brother Watch described the "mass biometric surveillance" as "disturbing" and "disproportionate." He highlighted that facial recognition technology remains unregulated in the UK, with police forces essentially creating their own rules regarding its use and watchlist management.

Controversial History and Legal Challenges

The use of facial recognition technology has proven controversial due to instances of misidentification. In one notable case, a Londoner was mistakenly ejected from a Sainsbury's supermarket after staff using similar cameras incorrectly identified him as a criminal. Additionally, legal challenges have emerged, such as when Big Brother Watch's Silkie Carlo and anti-knife crime activist Shaun Thompson contested the Metropolitan Police's use of live facial recognition tools after Thompson was wrongly identified by a camera van in 2024.

Ruth Ehrlich, a director at the human rights organization Liberty, criticized the government for advancing facial recognition deployments while consultations on a legal framework are still ongoing. "Facial recognition enables police to track and monitor people as they go about their daily lives," Ehrlich said. "Its use to date has been deeply flawed, with children wrongly placed on watchlists and Black people at greater risk of being misidentified."

Future Expansion and Regulatory Gaps

The British Transport Police has indicated that additional trials at other stations will be announced prior to implementation. Meanwhile, the Metropolitan Police reported using facial recognition technology 231 times last year, scanning approximately 4 million faces. Despite plans to extend the use of these cameras, comprehensive regulations governing their deployment remain absent.

Privacy campaigners argue that the rapid rollout of facial recognition technology without proper safeguards poses significant risks to civil liberties. They call for immediate government intervention to establish transparent oversight mechanisms and ensure that public rights and privacy are prioritized in any future deployments of artificial intelligence surveillance tools.