
Apple's claims about how it will work are also completely unverifiable. What's stopping a government from providing Apple with hashes of any other kind of content it dislikes?



And then Apple reviews the account and sees that what was flagged was not CSAM. And again, the hashes aren’t of arbitrary subject matter; they’re of specific images. Using that to police subject matter would be ludicrous.


How would Apple know what the flagged content was if all they’re provided with is a list of hashes? I completely agree it’s ludicrous, but there are plenty of countries that want that exact functionality.


If they have the hash/derivative, they don’t need to look on the device or even decrypt anything; they’ll know that data with this hash is on the device, and presumably hundreds of other matching hashes from the same device.
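
The matching side really is just set membership plus a reporting threshold. A minimal sketch of that idea (not Apple's actual NeuralHash/PSI protocol; KNOWN_HASHES, the threshold value, and the SHA-256 stand-in for a perceptual hash are all invented here):

    import hashlib

    # Sketch only -- a real perceptual hash is robust to resizing and
    # re-encoding, which the SHA-256 stand-in below is not, and the real
    # hash list is blinded so the client can't read it.

    THRESHOLD = 30                      # hypothetical reporting threshold
    KNOWN_HASHES: set[bytes] = set()    # provided list of known-image hashes

    def image_hash(image_bytes: bytes) -> bytes:
        return hashlib.sha256(image_bytes).digest()

    def count_matches(library: list[bytes]) -> int:
        # Only membership in the known-hash set is tested; the content
        # itself is never inspected.
        return sum(1 for img in library if image_hash(img) in KNOWN_HASHES)

    def exceeds_threshold(library: list[bytes]) -> bool:
        # Nothing is reportable until enough matches accumulate.
        return count_matches(library) >= THRESHOLD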


The matched image’s decrypted security voucher includes a “visual derivative.” In other words, they can presumably do some level of human comparison to verify whether or not it is a valid match.
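
For what it’s worth, the voucher shape described would look something like this (field names and the review flow are hypothetical; the real system uses threshold secret sharing, so vouchers only become readable once enough matches accumulate):

    from dataclasses import dataclass

    # Hypothetical illustration of the "visual derivative" review step;
    # field names and logic are assumptions, not Apple's actual design.

    @dataclass
    class SafetyVoucher:
        matched_hash: bytes
        visual_derivative: bytes  # low-res preview of the matched photo

    def review_account(vouchers: list[SafetyVoucher], threshold: int) -> bool:
        # Below the threshold nothing is decryptable; above it, a reviewer
        # only ever sees the visual derivatives of matched images, so a
        # false match can be rejected without touching the rest of the
        # library.
        if len(vouchers) < threshold:
            return False
        return any(is_valid_match(v.visual_derivative) for v in vouchers)

    def is_valid_match(derivative: bytes) -> bool:
        # Stand-in for the human reviewer's judgment.
        raise NotImplementedError("human review step")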



