In late 2023, the FTC took enforcement action against drug store chain Rite Aid over its use of AI-powered facial recognition software to prevent shoplifting. The FTC found that Rite Aid failed to properly assess the technology before deploying it, concealed its use from shoppers, and relied on inaccurate and biased results that unfairly flagged shoppers as potential suspects.
Why It Matters
The FTC's order in this case binds only Rite Aid, but it sends clear signals about what the FTC will consider when assessing AI and facial recognition/biometric information practices. Among other things, the order requires Rite Aid to: conduct risk assessments before using such technologies to weigh their potential benefit to the company against their potential impact on shoppers (including biased outcomes); evaluate proposed suppliers' technology for accuracy and other qualities; disclose the use of such technology to shoppers and establish a consumer complaint process; train employees on its use; and implement a biometric data security program. Any other company considering AI and facial recognition (or other biometric data-powered systems) would do well to review the Rite Aid order and consider which of these measures it can adopt and document internally.