Did you learn the same thing—that E2E encryption doesn’t need to be broken for this to work?

The only event in which Apple can gain access to your content is if you have multiple CSAM matches; even then they can access only the matching content, and action is taken only if a human reviewer manually confirms it to be CSAM.
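
To make the threshold mechanic concrete, here is a minimal Shamir-style secret-sharing sketch in Python. It is not Apple's published protocol; the field size, the threshold, and the idea that each matching photo contributes one share of a decryption key are illustrative assumptions. It only shows why the server learns nothing below the threshold and can recover the key once the threshold is reached.

```python
# Toy sketch (not Apple's actual protocol) of the threshold idea:
# each "match" contributes one Shamir share of a decryption key, and the
# server can reconstruct the key only once it holds >= THRESHOLD shares.
# Names, field size, and THRESHOLD are illustrative assumptions.
import random

PRIME = 2**127 - 1          # prime field modulus for the toy scheme
THRESHOLD = 3               # shares needed before reconstruction works

def make_shares(secret: int, n_shares: int, threshold: int = THRESHOLD):
    """Split `secret` into n_shares points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def eval_poly(x):
        acc = 0
        for c in reversed(coeffs):
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, eval_poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0; only correct with >= THRESHOLD shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    key = random.randrange(PRIME)            # stand-in for the account decryption key
    shares = make_shares(key, n_shares=10)   # one share per hypothetical matching photo
    print(reconstruct(shares[:2]) == key)    # False: below threshold, key stays hidden
    print(reconstruct(shares[:3]) == key)    # True: threshold reached, key recoverable
```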

The issue is whether this kind of matching gets used for purposes other than CSAM; unfortunately, they gave themselves legal permission to do so back in 2019. That's what we should object to, not CSAM reporting itself.




I didn't say anything about breaking E2E encryption. Anything a human in the middle can review, under any circumstances, isn't E2E encrypted. Call it something else.

The issues are that the hash algorithm is secret, the decryption threshold is secret, and the database of forbidden content can't be audited. People claim it includes entirely legal images. And it's a small step from there to scanning local-only files.
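
For a sense of what fuzzy matching against an opaque hash database looks like, here is a toy average-hash sketch in Python. It is not NeuralHash; the distance threshold and helper names are made up for illustration. The point is that the matching is perceptual and approximate, which is exactly why a visually similar but entirely legal image could in principle collide.

```python
# Toy perceptual "average hash" (NOT NeuralHash) to illustrate how fuzzy
# image matching against an opaque hash database works. All names and the
# distance threshold are illustrative assumptions.
from typing import List, Set

def average_hash(pixels: List[List[int]]) -> int:
    """Hash an 8x8 grayscale grid: one bit per pixel, set where the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(image_hash: int, database: Set[int], max_distance: int = 5) -> bool:
    """Fuzzy match: a hash within `max_distance` bits of any database entry counts.
    Because matching is fuzzy, a visually similar but legal image can collide."""
    return any(hamming(image_hash, h) <= max_distance for h in database)

if __name__ == "__main__":
    original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
    slightly_edited = [[min(255, v + 3) for v in row] for row in original]
    db = {average_hash(original)}                                # stand-in for the opaque hash list
    print(matches_database(average_hash(slightly_edited), db))   # True: near-duplicate still matches
```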


It's still end-to-end for non-CSAM-matching content. Sounds fair, as long as it's strictly for CSAM check purposes.


> as long as it's strictly for CSAM check purposes

And that's precisely the leaky part of this setup. Nothing about this system's design prevents that from changing on a mere whim or government request.

Next year they could be adding new back-end checks against political dissident activity, Uyghur Muslims, ... and we'd be none the wiser.


Thank you for confirming the exact point I made.



