
I've heard it argued that this allows Apple to move iCloud toward end-to-end encryption (which would be a good thing for privacy, right?). It seems like the current US government position is "we'll let tech giants use end-to-end encryption for user data as long as they put measures in place to catch CSAM."

by "US government position" I'm including negotiations and proposed/threatened legislation, not just the current laws on the books.



If you really believe that, then I have a bridge to sell you. The exact same safety argument can and will be applied to ALL data whether on your device or Apple's servers.

Why limit to iMessage and photos? Why not Signal on your device?

Why not your backups on Apple servers? Oh wait, that already happens.


I think Apple would rather put this CSAM-scanning system in place (which allows them to implement end-to-end encryption for iCloud in the future) than deal with the EARN-IT Act or something similar becoming law, which could effectively make all e2e-encrypted services illegal (by requiring a government backdoor).

>The bill also crafts two additional changes to Section 230(c)(2)'s liability: it allows any state to bring a lawsuit against service providers if they fail to deal with child sexual abuse material on their service, or if they allow end-to-end encryption on their service and do not provide means for enforcement officials to decrypt the material.

https://en.wikipedia.org/wiki/EARN_IT_Act_of_2020


If that were the case, wouldn't this CSAM scanning system be insufficient to meet those EARN-IT requirements?

You have other Apple services and third-party apps that host material on Apple's servers.

For example, if a user turns on iCloud backups, then every third-party app's Documents directory is backed up to iCloud. Would it be a violation to not CSAM scan that content? What if the contents are encrypted? Would they be required to be decrypted so that they are CSAM-scanned?

iCloud Drive is another Apple service that backs up to Apple's servers. Wouldn't its absence from the list be a violation? What if a user hosts encrypted files on iCloud Drive? Would the user be required to decrypt them so that Apple can scan them?

It seems that the real intention is to eliminate end-to-end encryption.


>wouldn't this CSAM scanning system be insufficient to meet those EARN-IT requirements?

Yes. My point is that there's an ongoing dance between the tech companies and the government, and through their negotiations and government connections Apple probably views this CSAM-scanning move as making an EARN-IT-like law less likely to be passed. It's overall the less-invasive option. The US federal government is putting pressure on tech companies not to host CSAM, and if tech giants didn't agree to do stuff like this the government could respond by passing stricter laws to effectively make unbackdoored e2e encryption illegal.

Apple has a lot of influence but at the end of the day they're a US-based company that has to follow US laws. Voluntarily implementing CSAM-scanning is in their own interest as a "pro-privacy" company if it prevents more draconian anti-encryption laws from being passed that could effectively outlaw e2e encryption.

I don't view this as Apple singlehandedly trying to eliminate end-to-end encryption; that seems like a pretty radical view of the situation to me but of course you're free to hold that opinion.


I don’t hold the view that Apple is trying to eliminate end-to-end encryption. I view this as a push by governments to do so, combined with the increasing willingness of the tech industry to work with them.

This is more like Apple giving way gradually, and the government is happy since in the long run it gets everything it wants.

Examples: we don’t unlock phones for the government… but we give them all the data if you back up your phone… but you have so much privacy!

We don’t read your messages, oh wait, now we do, but only for child abuse, oh wait, we don’t control what it looks for, but let’s not talk about that because it hurts our marketing.


Hard to believe that a law limiting that would stand up in the Supreme Court, and Apple has previously indicated they would be willing to pay whatever legal costs are necessary to defend themselves from that sort of attack.

It is weird that Apple would do this in the first place though, it certainly doesn't make me want to use their products.


It moves iCloud to end-to-end encryption by compromising the ends. Not really a reasonable outcome.


> move iCloud toward end-to-end encryption (which would be a good thing for privacy, right?).

"End-to-end" encryption is nothing to strive for if you're destroying the ends. With this change, the "ends" become the users themselves. By embedding an agent acting on behalf of Apple/government, the device in front of end users is no longer a tool but rather an adversary. This is computational disenfranchisement.


Hey if the ends justify the means.

I’ll show myself out.


Feels like nobody here has bothered to read the actual technical specifications.[1]

This already adds a new level of encryption to iCloud-stored images. They have essentially created an E2EE system with specific access (a backdoor), while preventing the use of that backdoor for any purpose other than CSAM detection (so nobody can ask them to decrypt something else on a whim). They can only decrypt images when a user's account reaches the threshold of CSAM hash matches:

> Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images.

While this is not a perfect end-to-end encryption solution, it is better than server-side encryption alone. Now there are two levels of encryption. If someone breaches Apple's servers and also has access to the server-side private keys, they still need a matching NeuralHash value to decrypt the images.
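
A rough illustration of the threshold mechanism (a toy Python sketch, not Apple's actual code): the key protecting the voucher payloads is split with threshold secret sharing, so no single match reveals anything, and only once a threshold number of vouchers match known hashes can the key be reconstructed. The field, the threshold of 30, and the share count below are made-up values for illustration.

    import random

    PRIME = 2**61 - 1  # toy prime field for Shamir-style secret sharing

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1 with the secret as constant term;
        # each share is one evaluation point. Fewer than `threshold` shares reveal nothing.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
                for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term (the secret).
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    # Hypothetical account: one share per uploaded image; a share only becomes
    # usable by the server when the image's NeuralHash matches the database.
    account_key = random.randrange(PRIME)
    threshold = 30  # made-up threshold
    shares = make_shares(account_key, threshold, count=100)

    print(reconstruct(shares[:threshold]) == account_key)      # True: at/above threshold
    print(reconstruct(shares[:threshold - 1]) == account_key)  # False (w.h.p.): below threshold

In the real design the shares ride inside the safety vouchers and the matching itself is wrapped in a private set intersection protocol (per the technical summary); the sketch only shows why fewer-than-threshold matches decrypt nothing.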

[1]:https://www.apple.com/child-safety/pdf/Expanded_Protections_...


I think the argument here is that (1) the model is going to have false positives (e.g. revealing pictures of you and your spouse, your beach photos, etc.) that will permit access to non-CSAM content (or, at the very least, mark your account as suspicious in the eyes of authorities), and (2) the model itself can be updated/replaced for any reason, potentially at any government's demand, so the limits on scope are effectively moot.


For argument (1), they are only looking for matches against the existing database of hashes that NCMEC provides. They are not developing a general AI to identify new pictures; they are only trying to stop redistribution of known files. Because of that, their claim of 1-in-1-trillion false positives might actually be close to correct, since it is easy to validate during development. Also, there is human verification before law enforcement is involved.
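
A quick back-of-the-envelope on why a match threshold makes an account-level 1-in-1-trillion figure plausible, using entirely made-up numbers (a hypothetical per-image false-match rate, upload volume, and threshold, not Apple's published parameters):

    from math import comb

    p = 1e-6      # hypothetical per-image false NeuralHash match rate
    n = 10_000    # hypothetical photos uploaded by one account in a year
    t = 30        # hypothetical account-level match threshold

    # P(at least t false matches among n uploads) = binomial upper tail.
    # Terms beyond k = t + 40 are negligible here, so the sum is truncated.
    p_account = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, t + 40))
    print(f"account-level false-positive probability ~ {p_account:.1e}")

Even with a pessimistic per-image rate, requiring many independent matches before anything becomes decryptable drives the account-level probability far below one in a trillion; the contested part is the human review and what ends up in the hash database, not the arithmetic.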

For argument (2), this might be valid, but yet again, all we can do is trust Apple, as we do all the time by using their closed-source system. The model can be changed, but it is still a better option than storing everything unencrypted? (In case you mean forging hashes to decrypt content.)

As for surveillance, it is not a strong argument because, again, the system is closed and we only know what they tell us. Creating such a model is trivial, and nothing would stop a government from demanding one anyway if Apple were willing to allow it. The system would be no different from the antivirus engines that have existed since the 1980s.

This is such a PR failure for Apple, because all their incoming features improve privacy in the CSAM area; everything negative comes from speculation about things that were equally possible already.



