
> To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

I think the article is wrong about this. Or, right-but-situationally-irrelevant. As far as I can tell from Apple's statements, they're doing this only to photos which are being uploaded to iCloud Photos. So, any photo this is happening to is one that you've already asked Apple to copy to their servers.

> In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple -- not NCMEC.

I also suspect this is a fuzzy legal area, and it would depend on when they can actually be said to be certain there's illegal material involved.

Apple's process seems to be: someone has uploaded photos to iCloud and enough of their photos have tripped this system that they get a human review; if the human agrees it's CSAM, they forward it on to law enforcement. There is a chance of false positives, so the human review step seems necessary...
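
To make that concrete, here's a rough sketch of the flow as I understand it. The names, the threshold value, and the plain hash lookup are all my own simplifications -- the real system apparently uses NeuralHash plus cryptography (private set intersection, threshold secret sharing) that this deliberately ignores:

  # Illustrative sketch only -- not Apple's code.
  from typing import Iterable

  KNOWN_CSAM_HASHES = set()   # opaque hash database supplied via NCMEC (placeholder)
  REVIEW_THRESHOLD = 30       # hypothetical number of matches before human review

  def perceptual_hash(photo: bytes) -> str:
      """Stand-in for a perceptual hash such as NeuralHash."""
      raise NotImplementedError

  def scan_icloud_uploads(photos: Iterable[bytes]) -> None:
      # Only photos already being uploaded to iCloud Photos are considered.
      matches = [p for p in photos if perceptual_hash(p) in KNOWN_CSAM_HASHES]
      if len(matches) >= REVIEW_THRESHOLD:
          queue_for_human_review(matches)   # nothing is reported before this step

  def queue_for_human_review(matches: list) -> None:
      """Placeholder: a human checks the flagged photos before any report is filed."""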

After all, "Apple has hooked up machine learning to automatically report you to the police for child pornography with no human review" would have been a much worse news week for Apple. :D



> There is a chance of false positives, so the human review step seems necessary...

You misunderstand the purpose of the human review by Apple.

The human review is not there to handle false positives: the system is designed to have an extremely low rate of hits on images that aren't actually in the database, and the review invades your privacy regardless of who does it.
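
To put numbers on "extremely low": with a per-image false-match rate and a match threshold (both invented here -- Apple's published figures differ), the chance of an innocent account ever reaching review is negligible by design:

  # Rough binomial-tail calculation with made-up numbers; not Apple's analysis.
  def false_flag_probability(n_photos: int, per_image_fp: float, threshold: int) -> float:
      """P(at least `threshold` spurious matches among n_photos independent photos)."""
      p, q = per_image_fp, 1.0 - per_image_fp
      term = q ** n_photos                            # P(exactly 0 matches)
      total = term if threshold == 0 else 0.0
      for k in range(1, n_photos + 1):
          term *= (n_photos - k + 1) / k * (p / q)    # P(k matches) from P(k-1 matches)
          if k >= threshold:
              total += term
              if term == 0.0:                         # remaining terms underflow to zero
                  break
      return total

  # 10,000 photos, a (made-up) one-in-a-million spurious match per photo, threshold of 30:
  print(false_flag_probability(10_000, 1e-6, 30))     # astronomically small, ~4e-93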

The human review exists to legitimize an otherwise unlawful search via a loophole.

The US Government (directly, or through entities like NCMEC that effectively act as its agents) is barred from searching or inspecting your private communications without a warrant.

Apple, by virtue of your contractual relationship with them, is free to do so, so long as they are not coerced into it by the government. When Apple reviews your communications and finds what they believe to be child porn, they're then required to report it, and because the government is merely repeating a search that Apple already (legally) performed, no warrant is required.

So, Apple "reviews" the hits because, per the courts, if they just sent automated matches without review, that wouldn't be sufficient to avoid the need for a warrant.

The extra review step does not exist to protect your privacy: the review itself deprives you of your privacy. The review step exists to suppress your Fourth Amendment rights.


This is the part I am very concerned about. This is definitely a violation of 4th Amendment rights, because images are viewed by humans off the device. What happened to just scanning for them on the device?


It's one thing to be concerned about individuals violating the terms of service, with scanning on the device to identify them and refer them to law enforcement. It's a WHOLE other thing to have humans somehow review images that are not on the device.


So... how well does "human review" work with copyright on YouTube?

This is basically fearmongering: say "if you're not a pedo, you have nothing to fear", install the tech on all phones, and then use that tech to find the next WikiLeaks leaker (whoever was the first person with a given photo), Trump supporters (just add the trump-beats-cnn-gif to the hashes), anti-China protesters (Winnie the Pooh photos), etc.

This is basically like forcing everyone to "voluntarily" record inside their houses; an AI on those cameras would then recognise drugs, and only those recordings would be sent to the police.


I was pointing out an inaccuracy in the article, not commenting about whether or not this tech is a good idea. I think if we're opposed to it, we should avoid misrepresenting it in arguments.

On which note... it really does seem to be voluntary, as there's an easy opt-out of the process in the form of "not using iCloud Photos". Or Google Photos, or assorted other services, which apparently all do similar scanning.

Yes, there's a slippery-slope argument here, but the actual service-as-it-exists-today really does seem scoped to a cautious examination of possible child porn that's being uploaded to Apple's servers.


> just add the trump-beats-cnn-gif to the hashes

They could literally do this right now server-side and nobody would ever know.
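
For what it's worth, server-side this is just a lookup against an opaque list that only the provider can see. A generic sketch (not Apple's actual code, and real systems use perceptual hashes rather than the exact SHA-256 below), just to show that extending the list is a one-line change nobody outside the company would notice:

  # Hypothetical server-side scan of uploads against an opaque blocklist.
  import hashlib

  BLOCKLIST = {
      "placeholder-known-csam-hash-1",
      "placeholder-known-csam-hash-2",
      # "hash-of-trump-beats-cnn.gif",   # adding a new target is this easy
  }

  def scan_upload(blob: bytes) -> bool:
      """Return True if this uploaded blob matches anything on the blocklist."""
      return hashlib.sha256(blob).hexdigest() in BLOCKLIST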


But no one uploads that to iCloud. That's why they must implement this feature client-side ("because if you're not a pedo, you have nothing to worry about"), and then enable it for on-phone-only photos too ("because if you're not a pedo, you have nothing to worry about"). Then use the same feature on OSX ("because if you're not a pedo, you have nothing to worry about").



