
I think that section was rewritten; it currently reads:

> [Revised; thanks CW!] Apple's iCloud service encrypts all data, but Apple has the decryption keys and can use them if there is a warrant. However, nothing in the iCloud terms of service grants Apple access to your pictures for use in research projects, such as developing a CSAM scanner. (Apple can deploy new beta features, but Apple cannot arbitrarily use your data.) In effect, they don't have access to your content for testing their CSAM system.

> If Apple wants to crack down on CSAM, then they have to do it on your Apple device.

(Which also doesn't really make sense: if the iCloud ToS didn't grant Apple the rights needed for CSAM scanning, they could simply revise it. In any case, I think they probably already have the rights they need.)



Indeed, for three years or so Apple's privacy policy has specifically included this:

"Security and Fraud Prevention. To protect individuals, employees, and Apple and for loss prevention and to prevent fraud, including to protect individuals, employees, and Apple for the benefit of all our users, and prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."

https://www.apple.com/legal/privacy/en-ww/

Under "Apple's Use of Personal Data". They had that in there since at least 2019.

Add to that the fact that an Apple executive told Congress two years ago that Apple scans iCloud data for CSAM.



