Rite Aid’s Misuse of Facial Recognition Technology Leads to FTC Action

by time news

The Federal Trade Commission (FTC) announced on Tuesday that the pharmacy chain Rite Aid used facial recognition technology in a way that subjected shoppers to unfair searches and humiliation. This landmark settlement could raise questions about the use of facial recognition technology in stores, airports, and other venues nationwide.

According to the FTC, Rite Aid activated face-scanning technology in hundreds of stores between 2012 and 2020 in an effort to crack down on shoplifters and other problematic customers. However, the chain’s failure to adopt safeguards, along with the technology’s long history of inaccurate matches and racial biases, led to false accusations and humiliation of innocent shoppers.

In one case, an 11-year-old girl was searched by a Rite Aid employee due to a false facial recognition match, causing distress for the child and her family. Another incident involved employees calling the police on a Black customer after the technology mistook her for the actual target, a White woman with blond hair.

Rite Aid has agreed to a five-year settlement under which it will stop using the technology, delete the face images it collected, and report to the FTC annually on its compliance.

The FTC revealed that Rite Aid’s facial recognition system generated thousands of false matches, and many errors disproportionately involved the faces of women, Black people, and Latinos. Furthermore, the system was deployed primarily in stores used predominantly by people of color, leading to many shoppers feeling racially profiled.

This case is part of a broader trend of algorithmic unfairness, according to FTC Commissioner Alvaro Bedoya, who called on company executives and federal lawmakers to consider banning or restricting how biometric surveillance tools are used on customers and employees.

As a result of the settlement, retail chains' use of discriminatory and invasive facial recognition technology may face greater scrutiny, and advocacy groups are urging corporations to reconsider deploying it.

Evan Greer of the advocacy group Fight for the Future said, “The message to corporate America is clear: stop using discriminatory and invasive facial recognition now, or get ready to pay the price.”

The Rite Aid case serves as a reminder of the need for comprehensive privacy laws in the United States to protect the public from reckless adoption of surveillance technologies, noted AI researcher Joy Buolamwini.

The case may mark a turning point in the use of facial recognition technology and the protection of consumer privacy, as regulators and advocacy groups work to address the risks and unfairness associated with this invasive technology.
