I think you're confusing two different things: the optional nudity detection and the mandatory iCloud CSAM scanning.
They abandoned the latter, and that's what the article is about.
The first one is a child-safety feature that, if enabled, scans content and warns the user.
The second one scans all of everyone's content, and if it detects more than some threshold number of items, your data is decrypted and you are reported to the police. A deeply disturbing, dystopian feature where your personal devices become the police and snitch on you. Today they say it's CSAM, but there's no technical reason not to expand it to anything.
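To make the mechanism concrete, here is a deliberately naive sketch of a threshold-and-report pipeline. The actual proposal used NeuralHash plus private set intersection and threshold secret sharing rather than plain client-side counting, and every name and value below is a placeholder, not Apple's API:

```python
import hashlib

KNOWN_HASHES = {"a3f1c2...", "9bd07e..."}  # placeholder fingerprint database
REPORT_THRESHOLD = 30                      # the publicly discussed figure was around 30 matches

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash (NeuralHash in the real proposal);
    # a cryptographic hash like this only matches byte-identical copies.
    return hashlib.sha256(image_bytes).hexdigest()

def library_flagged(images: list) -> bool:
    # Count matches against the database and flag once the threshold is crossed.
    matches = sum(1 for img in images if fingerprint(img) in KNOWN_HASHES)
    return matches >= REPORT_THRESHOLD  # True => account surfaced for human review
```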
iCloud CSAM scanning does not do what you described in your last paragraph; you're describing the proposed on-device CSAM scanning. There are many "product features" built from a finite set of technical building blocks. Let's break it down:
-----------------------
Encryption, Storage and Transport:
-Presently, this is considered "strong" on-device: Apple claims they cannot access content on your device, and it is encrypted.
-Data sent to their servers as part of the automatic (opt-in) photo backup service (iCloud Photos) is considered fair game to scan, and they do that routinely.
-This data is encrypted "end to end". What that means is up for debate, since the term has taken on the shape of a commercial brand and isn't a technical guarantee of anything.
------------------
Explicit consent for access and storage:
-This is the point. Your device belongs to you. No one should be snooping around there.
-Apple is presently retiring the aforementioned iCloud photo scanning and has simultaneously released a statement that is very obscure about what precisely they mean.
-They indicate that scanning on the server is waayyy too late to prevent CSAM, and that they think the best way to prevent it is at the point of origin.
-Of course, what that means is: "Scan everything; when something matches, do an action."
-The "Opt-in" here is for the warnings on a child device, but it is an OS LEVEL FEATURE.
-If some media is being scanned, ALL media is potentially being scanned.
-Presently, for audio, one easy way to cut the bandwidth cost of backing up and scanning on a server is to move the scanning onto the device itself.
-This has been demonstrated by the Accessibility menu feature I have already called out in my parent comment. You have an audio buffer on your iPhone TODAY that is ALWAYS ACTIVE. When the accessibility toggle is turned on, the contents of this buffer are regularly classified against a trained model.
-When a match occurs, the OS responds with the configured response (see the sketch below).
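A rough sketch of that buffer-and-classify pattern, with every name invented for illustration; this is not the actual Sound Recognition API:

```python
from collections import deque
from typing import Optional

SAMPLE_RATE = 16_000
# Rolling on-device buffer, continuously filled while the feature is active.
audio_buffer: deque = deque(maxlen=5 * SAMPLE_RATE)

def classify(window: list) -> Optional[str]:
    """Stand-in for the trained sound-recognition model."""
    return None  # e.g. "doorbell", "siren", or None when nothing matches

def configured_response(label: str) -> None:
    print(f"OS action: notify the user about '{label}'")

def tick() -> None:
    # Runs periodically while the accessibility toggle is on:
    # classify the buffer contents, act on any match.
    label = classify(list(audio_buffer))
    if label is not None:
        configured_response(label)
```

Swap the buffer for photos or messages and the model for a different classifier, and the same loop covers every case the next paragraph warns about.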
THIS IS A DANGEROUS FRAMEWORK. Swap the media type for any generic type, and swap what you're looking for from CSAM to a political phrase such as "abortion". You're asking us to TRUST that the company will never be compelled to do that by authoritarian governments, or any hostile entity for that matter? No fucking thank you.
Nope. In the proposed system the scan was performed by the device; the entity that evaluated the result of the scan is an implementation detail. Technically it was also Apple that would call the police, but that's also an implementation detail.
It's just your device scanning your files for content deemed illegal by the authorities and snitching on you to those authorities, but with extra steps.
And how do you think that fingerprint is attached, exactly? By scanning your files on your device.
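For anyone wondering what "attaching a fingerprint" involves in practice, here is a minimal difference-hash (dHash) sketch using Pillow. It is not NeuralHash, but it shows that the fingerprint is produced by reading and processing the file's contents on the device:

```python
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    # Shrink to a (size+1) x size grayscale image and compare neighbouring
    # pixels; the resulting 64-bit value survives resizing and re-encoding.
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits
```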
Anyway, what is the point of this conversation? Are we going to argue over what "scanning" means?
I'm sorry, but I find this intellectually dampening. Okay, not scanning but analyzing, and sending the results of the analysis to Apple, where Apple can scan. So?
Because you said this: "personal devices become the police and snitch on you"
Which makes it seem like Apple's approach is worse than what Google does, where they actually do scan through everyone's full-resolution photos on the server, doing what they please with the results.
My point in making this distinction is that if a company is going to do server-side scanning, Apple's approach is far more privacy-preserving than a company like Google's, and that point is being lost.
Correct, but this means all content if iCloud is enabled; Photos doesn't give you an option to create a folder or album whose photos are stored on-device only.
You also can't have a drop-in replacement cloud provider if you are not OK with being scanned and reported to the authorities for government-disallowed content (there's no technical reason for the reported content to be CSAM only; it can be anything), because alternative apps can't have the same level of iPhone integration as iCloud.
> Correct, but this means all content if iCloud is enabled
That's a pretty big distinction: all content versus all content on its way to iCloud.
> (there's no technical reason for the reported content to be CSAM only; it can be anything)
Now we're back to all sorts of assumptions. There's also no technical reason Apple can't scan phones right now.
Apple did say at the time that the hash database would be checked against different sources to prevent arbitrary content checks. And now with E2E encryption, Apple's proposed method was more secure than what happens in the various clouds today, where law enforcement could show up and ask them to search for anything.
Obviously the most secure is not to check at all, but the law may force the issue at some point.
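As described at the time, that "different sources" safeguard was an intersection: only hashes provided by at least two independent child-safety organizations under different jurisdictions would ship to devices. A toy sketch with placeholder names and values:

```python
def shipped_db(*source_dbs: set) -> set:
    # Only hashes that every independent source agrees on make it into
    # the on-device database.
    return set.intersection(*source_dbs)

ncmec = {"h1", "h2", "h3"}               # placeholder hashes
other_jurisdiction = {"h2", "h3", "h4"}  # second, independent source
print(shipped_db(ncmec, other_jurisdiction))  # {'h2', 'h3'}
```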