Facial Recognition Technology Mistakenly Identified Me As a Shoplifter.

Feeling the need for chocolate after a difficult day, Sara walked into a Home Bargains store. Soon after, a staff member accused her of theft and demanded that she leave. Sara, who wishes to remain anonymous, had been mistakenly flagged by Facewatch, a facial-recognition system. Although she showed she had done nothing wrong, she was still escorted out and told she was banned from every store that uses the technology. Upset, she feared being wrongly labeled a shoplifter. Facewatch later admitted its mistake in a letter to her.

Facewatch is commonly used by UK retailers such as Budgens, Sports Direct, and Costcutter to identify potential thieves. The company, which declined to discuss Sara’s situation with the BBC, asserts that its system effectively deters crime and protects staff. Home Bargains also did not provide a comment.

Facial recognition is not limited to stores. In Bethnal Green, London, police used a van with cameras to scan faces against a watchlist, potentially leading to confrontations or arrests. Critics argue that this reduces individuals to mere facial barcodes, much like a checkout scanner.


While we were filming, the Metropolitan Police reported six arrests aided by facial recognition technology. Those apprehended included two people for breaching sexual-harm prevention orders, one for a serious assault, and another for assaulting a police officer. Lindsey Chiswick, the Met's director of intelligence, praised the technology's speed, noting that it can generate a biometric facial image and compare it against a watchlist in under a second, discarding non-matches immediately.
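Chiswick's description (generate a biometric image, compare it against a watchlist, discard non-matches) maps onto a generic pattern that the sketch below illustrates. This is a minimal, hypothetical example rather than the Met's or Facewatch's actual system: it assumes a separate face-recognition model has already turned each detected face into a fixed-length embedding vector, and the threshold and identifiers are invented for the example.

```python
# Illustrative sketch of a watchlist comparison step (hypothetical, not the Met's
# or Facewatch's implementation). Assumes a face-recognition model has already
# converted each detected face into a fixed-length embedding vector.
from __future__ import annotations

import numpy as np

MATCH_THRESHOLD = 0.7  # invented similarity cut-off; real systems tune this value


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face_embedding: np.ndarray,
                            watchlist: dict[str, np.ndarray]) -> str | None:
    """Return the watchlist ID of the best match above the threshold, else None.

    If no entry clears the threshold, the embedding is simply discarded,
    mirroring the claim that non-matches are dropped immediately.
    """
    best_id, best_score = None, MATCH_THRESHOLD
    for person_id, listed_embedding in watchlist.items():
        score = cosine_similarity(face_embedding, listed_embedding)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id


# Toy usage with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
watchlist = {"suspect_001": rng.normal(size=128), "suspect_002": rng.normal(size=128)}
passer_by = rng.normal(size=128)  # a random face is very unlikely to match
known_face = watchlist["suspect_001"] + rng.normal(scale=0.05, size=128)  # near-duplicate

print(check_against_watchlist(passer_by, watchlist))   # typically None (discarded)
print(check_against_watchlist(known_face, watchlist))  # typically "suspect_001" (alert)
```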

The BBC interviewed several people who were correctly identified by the system, and the police confirmed that the technology has led to 192 arrests so far this year. Despite this, civil liberties organizations have raised concerns about its unproven accuracy, citing incidents like that of Shaun Thompson. Thompson, an employee of the youth group Streetfathers, was mistakenly identified by the system and unexpectedly detained by police near London Bridge.

Shaun Thompson claims he was wrongly identified. Credit: BBC

Shaun Thompson, the victim of a mistaken identification, was detained for 20 minutes, fingerprinted, and released only after showing his passport. He felt the stop was an invasion of privacy and that he was being presumed guilty. The BBC suggests the error may have been due to a familial resemblance, but the Metropolitan Police did not comment.

Silkie Carlo from Big Brother Watch has documented numerous instances of police using facial recognition and was present during Thompson’s apprehension. She notes that the public does not fully understand the technology, which she compares to a digital lineup that can lead to innocent people being questioned.

The Metropolitan Police's use of facial recognition is increasing, from nine deployments between 2020 and 2022 to 67 already in 2024. Officials say false identifications are rare, at about one in every 33,000 people scanned, but measured against alerts rather than scans the picture changes: roughly one in 40 alerts turns out to be a false match.
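To see how both figures can hold at once, here is a rough back-of-the-envelope calculation. The scan volume is an assumption chosen only to make the arithmetic clean; the two quoted rates are the only numbers taken from the reporting.

```python
# Illustrative arithmetic showing how "1 in 33,000 scans" and "1 in 40 alerts"
# can describe the same system. The 330,000-scan volume is assumed for the example.
faces_scanned = 330_000                      # assumed number of faces passing the cameras
false_alerts = faces_scanned / 33_000        # quoted rate: 1 false match per 33,000 scans -> 10
false_share_of_alerts = 1 / 40               # quoted rate: 1 in 40 alerts is a false match
total_alerts = false_alerts / false_share_of_alerts  # -> 400 alerts in total
true_alerts = total_alerts - false_alerts            # -> 390 correct alerts

print(f"{false_alerts:.0f} false alerts out of {total_alerts:.0f} alerts "
      f"from {faces_scanned:,} scans")
# Both figures hold at once because alerts are rare: the 10 false matches are a tiny
# fraction of all scans, but a much larger fraction (1 in 40) of the alerts raised.
```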

Michael Birtwhistle of the Ada Lovelace Institute highlights the early stage of the technology and the slow development of legal frameworks to regulate its use, describing the situation as a “Wild West.”

In Bethnal Green, public opinion is divided: some people are concerned about privacy, while others support the technology if it helps reduce crime. There are also questions about its long-term effectiveness, since people who want to avoid the cameras may learn to evade detection.

Civil liberties advocates worry that the normalization of facial recognition could lead to pervasive surveillance, while supporters argue that these concerns are overblown. Nevertheless, some of the public appears willing to accept facial scanning for the promise of increased safety.

