Apple announced a new feature that lets parents activate a setting that detects nudity in images sent from devices connected to a family iCloud account. The system can block those images and alert parents. A more controversial feature scans images uploaded to iCloud and compares them against a database of known child abuse images. Reactions fell along predictable lines, with child protection advocates applauding the move and privacy advocates worrying about a slippery slope. It's too soon to know whether Apple will move in the direction of less privacy. For now, we can appreciate the added protection against child abuse material.
Apple Announces New Program to Fight Child Exploitation
But in Apple's most technically innovative—and controversial—new feature, iPhones, iPads, and Macs will now also integrate a new system that checks images uploaded to iCloud in the US for known child sexual abuse images. That feature will use a cryptographic process that takes place partly on the device and partly on Apple's servers to detect those images and report them to the National Center for Missing and Exploited Children, or NCMEC, and ultimately US law enforcement.
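To make the idea of matching uploads against a database of known images concrete, here is a minimal, purely illustrative sketch in Python. It uses an ordinary cryptographic hash and a plain set lookup; the names `KNOWN_FINGERPRINTS`, `fingerprint`, and `flag_for_review` are invented for this example, and Apple's actual design, which pairs a perceptual hash with a cryptographic matching protocol split between the device and Apple's servers, is considerably more sophisticated.

```python
# Illustrative sketch only: a naive lookup of an image fingerprint
# against a set of known fingerprints. This is NOT Apple's system,
# which uses a perceptual hash and an on-device/server cryptographic
# protocol rather than a simple file hash and set membership test.
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints for known abuse images
# (in the real system, the reference database comes from NCMEC).
KNOWN_FINGERPRINTS: set[str] = {
    # "d2a84f4b8b650937ec8f73cd8be2c74a...",  # placeholder entry
}

def fingerprint(image_path: Path) -> str:
    """Return a fingerprint for an image file.

    Plain SHA-256 is used here for simplicity; a production system
    would need a perceptual hash that survives resizing and re-encoding.
    """
    return hashlib.sha256(image_path.read_bytes()).hexdigest()

def flag_for_review(image_path: Path) -> bool:
    """Flag an image if its fingerprint matches a known entry."""
    return fingerprint(image_path) in KNOWN_FINGERPRINTS
```

One reason the real system cannot work this simply is that a cryptographic hash changes completely if an image is cropped or re-encoded, which is why perceptual hashing and server-side cryptography are needed to detect known images without exposing the contents of everything a user uploads.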