The Rite Aid drugstore chain has taken some richly deserved batterings recently. First, claims stemming from its over-prescription of opioids contributed to its bankruptcy two months ago. And now the company has been banned from using facial recognition in its store security systems for a whole five years, thanks to its enthusiastic demonstration of how to grossly misuse the technology.
Rite Aid agreed to the ban yesterday in a settlement with the Federal Trade Commission (FTC) that also covered its violation of a 2010 data-security order. As is standard with such settlements, the company neither confirms nor denies the charges, but the details of the FTC’s complaint are grim.
From October 2012 to July 2020—when a Reuters investigation triggered its abandonment—Rite Aid’s facial recognition system led store employees to follow people around the store, ban them from shopping at Rite Aid, accuse them of being known shoplifters “in front of friends, family, acquaintances, and strangers,” detain or search them, and call the cops on them. Unfortunately, the system was unreliable and generated thousands of false positives. Rite Aid allegedly didn’t even test its accuracy before rolling it out.
Rite Aid fed the system low-quality imagery, which naturally degraded its accuracy, and it didn’t properly train employees to use it, though it did discourage them from telling customers that facial recognition was in use. The system automatically generated confidence scores indicating how likely it was that the person being filmed matched the person in the database, but employees generally didn’t get to see those scores.
Here's the bit of the complaint you should all have been waiting for by now: “As a result of Rite Aid’s failures, Black, Asian, Latino, and women consumers were especially likely to be harmed by Rite Aid’s use of facial recognition technology.” But of course.
“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” said FTC consumer protection chief Samuel Levine in a statement. “Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices.”
Here’s hoping that other retailers—and shoppers—take note. According to the FTC’s complaint: “An internal presentation advocating expansion of Rite Aid’s facial recognition program following Rite Aid’s pilot deployment of facial recognition technology identified only a single risk associated with the program.” Was it the risk of false positives? The risk to customers?
Nope—“media attention and customer acceptance.” On that point, at least, the company got it rite. More news below, and do also check out my colleague Alexandra Sternlicht’s must-read piece on what it was like to be a Figma employee learning that the company’s $20 billion acquisition just fell through.
David Meyer
Want to send thoughts or suggestions to Data Sheet? Drop a line here.