Innocent Man Detained After AI Facial Recognition System Fails
An innocent man was arrested after a police force's AI-powered facial recognition system incorrectly matched him to a burglary suspect. Alvi Choudhury, a 26-year-old software engineer from Southampton, was taken into custody on January 7 while working from home, based on a flawed identification by Thames Valley Police.
Details of the Mistaken Arrest
Choudhury's mugshot, taken during a wrongful arrest five years earlier, was matched by police systems to CCTV footage of a thief who stole £3,000 and jewelry from the Milton Keynes Buddhist Vihara in December. He was held until midnight before being interviewed; officers, who reportedly laughed during the exchange, realized within 10 minutes that they had the wrong man for a crime committed 80 miles away.
Choudhury told the Daily Mail that a Thames Valley Police officer admitted knowing he wasn't the suspect before the interview, having compared his custody photos with the footage. He expressed concern over potential racial discrimination, stating, "You've probably just seen two brown people, even though they have completely different features, and said, 'yeah, they look close enough. Let's arrest them.'"
Background and Legal Action
The CCTV footage showed a younger man with curly hair who bore little resemblance to Choudhury. His face was in police records because of a wrongful arrest in 2021, when, as a Portsmouth University student, he was attacked by a gang during a night out. Police detained him and his friends despite their injuries, releasing him only after connecting the incident to another attack that night.
Choudhury has since launched legal action against Thames Valley Police. He criticized the AI technology, saying, "No tech company would ever put a system into production with a failure rate of one in 25. That's horrific. It is filled with bugs." He is calling on the government to re-examine and regulate AI systems.
Racial Bias in Facial Recognition Technology
Live Facial Recognition (LFR) tools scan faces against watchlists of wanted individuals; police have praised the technology as a breakthrough comparable to DNA matching. However, the Home Office admitted in December that the technology returns more false positives for certain demographic groups: matches for Black faces are false positives 5.5% of the time, compared with 0.04% for white faces.
Experts note that AI systems often exhibit racial biases, struggling to identify people of color because training data is skewed toward white men. Critics warn this can lead to false matches, interrogations, watchlist entries, and job losses. Choudhury himself worries that the episode could make him appear suspicious to employers.
Police Response and Statistics
Thames Valley Police apologized for the distress but stated the arrest was based on officers' visual assessment, not racial profiling. They emphasized that retrospective facial recognition provided intelligence but did not determine the arrest, and that the arrest was not unlawful despite his later elimination from the investigation.
According to the National Police Chiefs' Council, about 25,000 searches using facial recognition systems are conducted monthly. This incident underscores ongoing debates about the reliability and ethics of AI in law enforcement.
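The scale of the problem can be sketched with a back-of-envelope calculation combining the Home Office's reported error rates with the NPCC's search volume. The sketch below is purely illustrative: it assumes, hypothetically, that the quoted false-positive rates applied uniformly across all 25,000 monthly searches, which the article does not claim.

```python
# Illustrative only: expected false matches per month if the quoted
# false-positive rates applied uniformly across the NPCC's reported
# ~25,000 monthly facial recognition searches (a hypothetical assumption).

SEARCHES_PER_MONTH = 25_000

FP_RATE_BLACK = 0.055   # 5.5% false-positive rate reported for Black faces
FP_RATE_WHITE = 0.0004  # 0.04% false-positive rate reported for white faces

def expected_false_positives(searches: int, fp_rate: float) -> float:
    """Expected number of incorrect matches for a given search volume."""
    return searches * fp_rate

# If every search concerned a Black subject: roughly 1,375 false matches
print(expected_false_positives(SEARCHES_PER_MONTH, FP_RATE_BLACK))
# If every search concerned a white subject: roughly 10 false matches
print(expected_false_positives(SEARCHES_PER_MONTH, FP_RATE_WHITE))
```

Even under this crude uniform-rate assumption, the two scenarios differ by more than two orders of magnitude, which is the disparity critics are pointing to.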



