Wrongful Arrest by Algorithm: The Alvi Choudhury Case and Your Rights
The arrest of Alvi Choudhury has sent shockwaves through the legal and tech communities, serving as a landmark example of how digital-first policing can lead to devastating human errors. Mr Choudhury, a 26-year-old software engineer, was working at his home in Southampton when police arrived, handcuffed him, and transported him 100 miles away.
He was detained for 10 hours over a burglary in Milton Keynes, a city he had never visited. This was not a simple case of mistaken identity by a witness; it was a wrongful arrest triggered by retrospective facial recognition technology. Despite Mr Choudhury having a beard and being significantly older than the suspect in the CCTV footage, an algorithmic match was used to justify a profound deprivation of his liberty.
At DPP Law, we specialise in holding the police to account. If you have been the victim of an AI-driven error, it is essential to understand that the law, not the algorithm, still dictates the limits of police power. For anyone wrongfully arrested as a result of this technology, knowing the legal landscape and your right to compensation is the first step towards redress. This guide outlines the risks of facial recognition and the steps you can take to secure justice.
What are the Leading Facial Recognition Technologies Used by UK Police?
UK police forces primarily rely on three distinct applications of facial recognition technology:
Retrospective Facial Recognition
Used only after a crime has occurred, this software analyses footage from CCTV, doorbells, or mobile phones and compares it against the 19 million mugshots on the Police National Database. This is the technology that led to the wrongful arrest of Alvi Choudhury.
Live Facial Recognition
Cameras mounted on vans or tripods scan faces in a crowd in real-time, comparing them against a specific watch list.
Operator-Initiated Facial Recognition
Mobile apps on officers’ phones allow them to scan a person on the street to verify their identity.
The primary algorithm currently procured by the Home Office is provided by Cognitec, a German company. While touted as state-of-the-art, the application of this software in real-world policing has exposed significant flaws.
How Accurate are Live and Retrospective Facial Recognition Systems?
While advocates of the technology point to high accuracy in controlled laboratory settings, the false-positive identification rate in operational policing remains a point of considerable contention.
Research commissioned by the Home Office and the National Physical Laboratory has highlighted a concerning in-built bias. At certain operational settings, the technology is significantly less accurate when identifying individuals of South Asian or Black heritage. In the Alvi Choudhury case, it was noted that Asian subjects are 100 times more likely to be misidentified than white subjects, while Black women are 247 times more likely to trigger a false match.
These errors stem from training data bias, where the AI is predominantly trained on Caucasian faces, making it less capable of distinguishing unique features in other demographic groups.
What are the Legal Limits on Police Facial Recognition in the UK?
The police do not have a blank cheque to use AI as they see fit. Their power is constrained by a patchwork of legislation and human rights protections:
PACE 1984 and Code G
The Police and Criminal Evidence Act 1984 (PACE) dictates that an arrest is only lawful if the officer has reasonable grounds to suspect involvement in a crime and if the arrest is necessary. If an officer relies solely on a computer match without performing basic due diligence, such as noticing a 10-year age gap or different facial hair, the reasonableness of the arrest can be legally challenged.
The Equality Act 2010
Police forces have a Public Sector Equality Duty (PSED). They must ensure that their systems do not have an unacceptable bias based on race or gender. Using an algorithm known to be significantly less accurate for certain ethnicities may constitute a breach of this duty.
Data Protection and Privacy Rights
Under the Data Protection Act 2018 and Article 8 of the European Convention on Human Rights (incorporated into UK law by the Human Rights Act 1998), individuals have a right to respect for their private life. The collection of biometric data (the unique map of your face) is highly intrusive. If the police scan you without a pressing social need, or if they retain your mugshot in the database after you have been cleared of any wrongdoing, they may be breaking the law.
The Breakdown of AI Policing
A wrongful arrest occurs when the police lack reasonable grounds to suspect an individual of a crime, or when the arrest is not necessary under PACE Code G. In the case of Alvi Choudhury, the system failed at three distinct levels:
- Failure of Human Oversight
The Home Office maintains that facial recognition is ‘intelligence, not fact’. Guidance states that a human officer must verify any match before an arrest is made. However, in Alvi’s case, the human visual assessment failed to account for obvious physical discrepancies in nose shape, lip size, and facial hair.
When officers rush in based on a computer’s suggestion, they often bypass the due diligence required by law. If an officer ignores glaring physical differences, the reasonableness of their suspicion can be challenged in a damages claim.
- The Mugshot Loophole
Perhaps the most alarming detail of the Choudhury case is why he was in the database in the first place. Alvi had no criminal record. His image was held because of a previous wrongful arrest in 2021, where he was actually the victim of an attack.
Unlike DNA and fingerprints, which have strict deletion rules, custody images (mugshots, as they are more commonly known) often remain in the system indefinitely unless a manual deletion request is made. This creates a mugshot loophole in which an innocent person becomes a perpetual suspect, liable to be flagged by an algorithm whenever someone with a similar skin tone or hair type commits a crime.
- Inherent Algorithmic Bias
As noted, recent research has exposed a concerning built-in bias in these systems. The statistical disparity in misidentification between different racial groups is a major hurdle for police forces attempting to prove their use of AI is non-discriminatory.
Legal Remedies for Wrongful Detention
If you are identified by a facial recognition camera or a retrospective facial recognition search, you are not without recourse. The algorithm does not override your fundamental civil liberties.
The Right To Compensation
If you have been wrongfully arrested due to a facial recognition error, you are likely entitled to significant financial compensation. A claim against the police can cover:
- False Imprisonment: Damages for the time you were unlawfully detained
- Aggravated Damages: If the police acted in an oppressive or insulting manner (for example, if officers mocked the physical differences between you and the suspect)
- Special Damages: To cover lost earnings or psychological support needed after the trauma
The Right to be Forgotten
A crucial part of any legal action we take at DPP Law is ensuring the data rectification of your record. We fight to ensure that the wrongful custody image and the associated biometric data are permanently scrubbed from the Police National Database, preventing you from being matched again in the future.
How DPP Law Can Help
Cases involving AI are not standard mistaken identity cases. They require a specialist solicitor who understands how to demand specific technical disclosures, such as the software’s similarity score, the deployment’s threshold settings, and the audit trail of the officer who verified the match.
As Alvi Choudhury noted in his interview with Good Morning Britain, we are at ‘the foothills of profound technological change’. We cannot allow the convenience of AI to replace the necessity of justice.
https://www.youtube.com/watch?v=BBKmUrppsBM
At DPP Law, we believe that technology should be a tool for the law, not a replacement for it. Iain Gould is currently representing Alvi Choudhury in his claim for damages against the police, and we will be monitoring closely how this case progresses through the courts.
If you believe you were arrested without lawful grounds, or if you are concerned about how your biometric data is being held by the police, we are here to help. Early legal support can help protect your future and clear your name. Contact DPP Law today for specialist advice from our Actions Against the Police team, led by national expert Iain Gould. Don’t let a digital error define your future.

