The state regulator charged with enforcing California's comprehensive privacy law has given official notice of a rulemaking to regulate certain aspects of AI tools. The draft regulations, which have been on hold for several months, would use the state's privacy law as the “hook” to require covered businesses to disclose their use of automated decision-making technology (ADMT) to California residents and to allow those residents to opt out of its use. The regulator voted 4-1 to proceed with the rulemaking, signaling strong support for enacting some form of the regulations following a public comment process.
WHY IT MATTERS
Companies covered by the California privacy law (the CCPA, as amended by the CPRA) have spent substantial time and resources since 2020 updating their practices and privacy policies to comply with its broad requirements: notice, access, the right to delete and correct, the right to opt out of the sale or sharing of personal information, the right to limit the use of sensitive personal information, and more. If the proposed regulations are adopted, those same companies will have to undertake a similar exercise for ADMT. The rules could cover an extremely broad range of technologies, including, for instance, employee evaluation tools and the tracking and profiling tools used in advertising.
For any covered company using a covered ADMT, there will be several steps to compliance, including the following:
- Auditing internal tools to determine whether and where any ADMT is in use,
- Confirming whether that use is covered by the law,
- Drafting and publishing appropriate disclosures to employees and customers,
- Implementing safeguards where required, and
- Ensuring a back-end process is in place to handle opt-out and access requests.
The rules would likely take effect in 2025 or 2026.