
You can read a bit more about how it would work here:

https://daringfireball.net/2021/08/apple_child_safety_initia...

(No, not arresting parents over bath time.)



I want to draw attention to one point in this:

> Will Apple actually flatly refuse any and all such demands? If they do, it’s all good. If they don’t, and these features creep into surveillance for things like political dissent, copyright infringement, LGBT imagery, or adult pornography — anything at all beyond irrefutable CSAM — it’ll prove disastrous to Apple’s reputation for privacy protection. The EFF seems to see such slipping down the slope as inevitable.

What seems to be missing from this discussion is that Apple is already doing these scans on the iCloud photos it stores. Therefore, the slippery-slope scenario is already a threat today. What's stopping Apple from acquiescing to a government request to scan for political content right now, or in any of the past years iCloud Photos has existed? The answer is that they claim not to, and their customers believe them. Nothing changes when the scanning moves on-device. Though, as the blog mentions, I suspect this is a precursor to allowing more private data in iCloud backups that Apple cannot decrypt even when ordered to.
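To make that concrete, here is a minimal sketch of hash-list matching, not Apple's actual NeuralHash/PSI protocol: a plain SHA-256 stands in for the perceptual hash, and `known_hashes` is a hypothetical input representing whatever list the scanner is given. The matching step is identical whether it runs on a server or on the phone; what differs is where it runs and who controls the list.

  import hashlib
  from pathlib import Path

  def image_fingerprint(path: Path) -> str:
      """Stand-in for a perceptual hash; here just a SHA-256 of the file bytes."""
      return hashlib.sha256(path.read_bytes()).hexdigest()

  def flag_matches(photos: list[Path], known_hashes: set[str]) -> list[Path]:
      """Return the photos whose fingerprint appears in the supplied hash list."""
      return [p for p in photos if image_fingerprint(p) in known_hashes]

  # Server-side scanning: the provider runs flag_matches over photos it stores.
  # On-device scanning: the same check runs before upload, against a hash list
  # shipped to the phone. Either way, whoever controls known_hashes controls
  # what gets flagged -- which is exactly the slippery-slope concern above.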


The slippery slope has already been slid down for every iCloud user in China.



