Apple delays controversial child safety features after privacy protests

Apple is delaying its child safety features announced last month, including a controversial feature that would scan users' photos for child sexual abuse material (CSAM), following intense criticism that the changes could decrease user privacy. The changes were scheduled to be implemented later this year.

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material," Apple said in a statement to The Verge. "Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

Apple's original press release about the changes, which were aimed at reducing the proliferation of CSAM, has a similar statement at the top of the page. That release detailed three major changes in the works. One change to Search and Siri would point users to resources for preventing CSAM if they searched for related information.

The other two changes came under more significant scrutiny. One would alert parents when their children were receiving or sending sexually explicit photos and blur those images for the children. The other would have scanned images stored in a user's iCloud Photos for CSAM and reported them to Apple moderators, who could then forward the reports to the National Center for Missing and Exploited Children, or NCMEC.

Apple extensively detailed the iCloud Photos scanning system to make the case that it did not undermine user privacy. In short, the system scanned photos stored in iCloud Photos on the user's iOS device and evaluated those photos against a database of known CSAM image hashes from NCMEC and other child safety organizations.
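
For illustration only, below is a minimal Swift sketch of that kind of hash matching. It substitutes a plain SHA-256 of the image bytes for Apple's perceptual NeuralHash and omits the cryptographic machinery the company described (blinded hashes, private set intersection, threshold secret sharing); the names placeholderImageHash, knownCSAMHashes, and matchCount are hypothetical, not Apple's API.

import Foundation
import CryptoKit

// Stand-in for a perceptual hash. Unlike a real perceptual hash, a byte-level
// SHA-256 changes whenever the image is re-encoded or resized.
func placeholderImageHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count how many of the user's photos match the known-hash database.
// In the design Apple described, a report would only reach human review
// after the number of matches crossed a threshold.
func matchCount(photos: [Data], knownCSAMHashes: Set<String>) -> Int {
    photos.filter { knownCSAMHashes.contains(placeholderImageHash(of: $0)) }.count
}

// Toy usage with dummy byte buffers in place of real image data.
let knownCSAMHashes: Set<String> = [placeholderImageHash(of: Data([0x01, 0x02]))]
let userPhotos: [Data] = [Data([0x01, 0x02]), Data([0x03, 0x04])]
print(matchCount(photos: userPhotos, knownCSAMHashes: knownCSAMHashes)) // prints 1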

Still, many privacy and security experts harshly criticized the company over the new system, arguing that it could create an on-device surveillance system and that it broke the trust users had placed in Apple to protect on-device privacy. The Electronic Frontier Foundation said in an August 5 statement that the new system, however well-intentioned, would "break key promises of the messenger's encryption itself and open the door to broader abuses." "Apple is compromising the phone that you and I own and operate," said Ben Thompson at Stratechery in his own criticism, "without any of us having a say in the matter."

