I think that section was rewritten; it currently reads:
> [Revised; thanks CW!] Apple's iCloud service encrypts all data, but Apple has the decryption keys and can use them if there is a warrant. However, nothing in the iCloud terms of service grants Apple access to your pictures for use in research projects, such as developing a CSAM scanner. (Apple can deploy new beta features, but Apple cannot arbitrarily use your data.) In effect, they don't have access to your content for testing their CSAM system.
> If Apple wants to crack down on CSAM, then they have to do it on your Apple device.
(Which also doesn't really make sense: if the iCloud ToS didn't grant Apple the necessary rights to do CSAM scanning there, they could just revise them. In any case, I think they probably already have the rights they need.)
Indeed, for three years or so Apple's privacy policy has specifically included this:
"Security and Fraud Prevention. To protect individuals, employees, and Apple and for loss prevention and to prevent fraud, including to protect individuals, employees, and Apple for the benefit of all our users, and prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."