
I don't really want my family photos reviewed by strangers. "Reducing the search space" of photos on my phone isn't an outcome I want to live with. If someone is looking at photos of me, my wife/husband/girlfriend/boyfriend, and my kids, they'd better have a darned good reason (e.g. a search warrant).

I'd also appreciate it if Apple let me know when my false positives were reviewed and found not to be CSAM.




Don’t upload an image anywhere, else it can be reviewed.


I saw a story on here yesterday about iPhones resetting to default settings after restarting. People were turning off cloud backups and then finding that their device had turned the feature back on after some time.


The whole point of Apple's system is that I don't need to upload an image anywhere.

Images from my phone can be stolen and reviewed with no due process, based on proprietary Apple technology.


The system as described only submits its safety vouchers when photos are uploaded to iCloud.
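
For concreteness, here's a minimal Swift sketch of that gating, under the assumption that Apple's description is accurate; every name in it (SafetyVoucher, computeSafetyVoucher, and so on) is hypothetical, not a real Apple API:

    import Foundation

    // Hypothetical sketch: per Apple's description, the voucher is only
    // computed and attached on the iCloud Photos upload path.
    struct SafetyVoucher {
        let payload: Data  // encrypted match metadata in Apple's design
    }

    func computeSafetyVoucher(for photo: Data) -> SafetyVoucher {
        // Stand-in for the on-device NeuralHash match + encryption step.
        return SafetyVoucher(payload: Data())
    }

    func uploadToICloud(photo: Data, voucher: SafetyVoucher) {
        print("uploading \(photo.count) bytes with a voucher attached")
    }

    func handle(photo: Data, iCloudPhotosEnabled: Bool) {
        // With iCloud Photos off, nothing is computed and nothing
        // leaves the device.
        guard iCloudPhotosEnabled else { return }
        let voucher = computeSafetyVoucher(for: photo)
        uploadToICloud(photo: photo, voucher: voucher)
    }

    handle(photo: Data(count: 1_000), iCloudPhotosEnabled: false)  // no-op
    handle(photo: Data(count: 1_000), iCloudPhotosEnabled: true)   // uploads with voucher

Note that the entire privacy property hangs on that one guard, which is exactly what the "small change" objection further down is about.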

Not saying it will stay that way, but there are three distinct realms of objection to this system, and it's probably useful to separate them:

1. Objections that in the future, something different will happen with the technology, system, or companies; so that even if the system is unobjectionable now, we should object because of what it might be used for in the future, or how it might change.

2. Objections that Apple can't be trusted to do what they say they are doing; so that even if they say they will only refer cases after careful manual review, or that they won't submit images for review that were not uploaded to iCloud, we can't believe them, so we should object.

3. Objections that hold for the system as designed and promised; in other words, even if all the actors do what they say they are doing in good faith and this monitoring never expands, it's still bad.

People who have the third kind of objection need to deal with the fact that Apple is basically putting in place a system with more careful safeguards than many Internet services already have, even for their "private" media storage or exchange. You likely don't know how the services you use scan for CSAM, but if a service is at all sizeable (chat, mail, cloud storage), it's likely using PhotoDNA or something similar.
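
To make "PhotoDNA or something similar" concrete: PhotoDNA itself is proprietary, but a toy perceptual hash shows the general shape of the technique. In this Swift sketch, the 8x8 average-hash and the distance threshold are illustrative assumptions, not PhotoDNA's actual design:

    // Toy perceptual hash: bit i of the hash is set when pixel i of an
    // 8x8 grayscale thumbnail is brighter than the thumbnail's mean.
    func averageHash(_ pixels: [UInt8]) -> UInt64 {
        precondition(pixels.count == 64, "expects an 8x8 grayscale thumbnail")
        let mean = pixels.reduce(0) { $0 + Int($1) } / pixels.count
        var hash: UInt64 = 0
        for (i, p) in pixels.enumerated() where Int(p) > mean {
            hash |= 1 << UInt64(i)
        }
        return hash
    }

    func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
        (a ^ b).nonzeroBitCount
    }

    // A match is declared when the distance falls under a tuned
    // threshold, not on exact equality, so recompression or resizing
    // doesn't defeat it.
    func matchesKnownList(_ hash: UInt64, against known: [UInt64], threshold: Int = 5) -> Bool {
        known.contains { hammingDistance(hash, $0) <= threshold }
    }

    var img = [UInt8](repeating: 10, count: 64)
    for i in 0..<32 { img[i] = 200 }       // top half bright
    var tweaked = img
    tweaked[0] = 90                        // small perturbation, e.g. noise
    print(hammingDistance(averageHash(img), averageHash(tweaked)))  // 1: still close

The point is that a service only needs to hold a list of hashes of known material, not the images themselves, to flag likely matches for review.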

I think there are valid objections on all three bases. But there's a difference in saying "this is bad because of something that might happen" and "this is bad because of what is actually happening".


I think the issue is that the content review happens on the phone, and it would be a small change to go from scanning uploaded photos to scanning all photos.


Oh yes, I agree. We will see a change in privacy policy before that happens. And Apple will lose a lot of us if that comes to pass.

For many years, scanning happened in the cloud. Soon it will happen on device, with the device sending a message about which item in the cloud is an issue.

I think it’s all about Apple moving ML jobs (like Siri) onto the device to lighten the load on their data centers.



