Hacker News

Good article, however-

"Due to how Apple handles cryptography (for your privacy), it is very hard (if not impossible) for them to access content in your iCloud account. Your content is encrypted in their cloud, and they don't have access. If Apple wants to crack down on CSAM, then they have to do it on your Apple device"

I do not believe this is true. Maybe one day it will be, and Apple is planning for it, but right now iCloud service data is encrypted only at rest and in transit; Apple holds the keys. We know this because iCloud backups have been surrendered to authorities, and of course you can log into the web variants to view your photos, calendar, etc. Not to mention that Apple has purportedly been doing the same hash checking on their side for a couple of years.

Thus far there has been no compelling answer as to why Apple needs to do this on device.



> why Apple needs to do this on device

Presumably to implement E2E encryption, while at the same time helping the NCMEC to push for legislation to make it illegal to offer E2E encryption without this backdoor.

Apple users would be slightly better off than the status quo, but worse off than if Apple simply implemented real E2E without backdoors, and everyone else's privacy will be impacted by the backdoors that the NCMEC will likely push.


> make it illegal to offer E2E encryption without this backdoor.

It isn’t a back door to E2E encryption. It can’t even be used to search for a specific image on a person’s device.

It could possibly be used to find a collection of images that are not CSAM but are disliked by the state, assuming Apple is willing to enter into a conspiracy with NCMEC.


It seems that Apple does not need to be a co-conspirator, and that it would be sufficient if someone added ‘malicious’ hashes to the NCMEC database.


Not correct. When enough images are detected to trigger a match, Apple employees review the visual derivatives to make sure they match before an alert is generated. They would need to collude.


You’re right, I was thinking about a breach of privacy in general rather than actual legal consequences. (The possibility of governments backdooring Apple’s servers to access decrypted files still stands, though that shouldn’t make a difference for this iCloud-Photos-only spyware.)


Whether users are worse off or not entirely depends on the rate of false positives. That's the biggest issue IMO and OP's article points out how there is zero published and trustworthy information on that.


I think that section was rewritten, it currently reads:

> [Revised; thanks CW!] Apple's iCloud service encrypts all data, but Apple has the decryption keys and can use them if there is a warrant. However, nothing in the iCloud terms of service grants Apple access to your pictures for use in research projects, such as developing a CSAM scanner. (Apple can deploy new beta features, but Apple cannot arbitrarily use your data.) In effect, they don't have access to your content for testing their CSAM system.
>
> If Apple wants to crack down on CSAM, then they have to do it on your Apple device.

(which also doesn't really make sense: if the iCloud ToS don't grant Apple the necessary rights to do CSAM scanning there, they could just revise them; however, I think they probably have the rights they need already)


Indeed, for three years or so Apple's privacy policy has specifically had this in it-

"Security and Fraud Prevention. To protect individuals, employees, and Apple and for loss prevention and to prevent fraud, including to protect individuals, employees, and Apple for the benefit of all our users, and prescreening or scanning uploaded content for potentially illegal content, including child sexual exploitation material."

https://www.apple.com/legal/privacy/en-ww/

Under "Apple's Use of Personal Data". They had that in there since at least 2019.

Add to that that an Apple executive told Congress two years ago that Apple scans iCloud data for CSAM.


You are correct — most of the iCloud data is not end-to-end encrypted. Apple discusses which data is end-to-end encrypted at https://support.apple.com/en-us/HT202303


They want to move to e2e for photos so they don't have to keep those keys. That's what this is part of — a way to prevent their service from being used for CSAM, yet still provide e2e encryption. I feel very ambivalent about this.


That was never mentioned by Apple. If that were their intention, I suspect they would have mentioned it alongside this announcement to provide justification and quell the (justified) outrage.

I also question the value of E2E when there's an arbitrary scanner that can send back the unencrypted files if it finds a match. If Apple's servers control the database of “hashes” to match, then is it all that different from Apple's servers holding the decryption keys?

Sure, E2E still prevents routine large-scale surveillance, but at the end of the day, if Apple (or someone who forces Apple) wants your data, they'll get it.


They don’t send unencrypted full-res files; they send a low-res “visual derivative” and can only decode it if they get more than x “hits”. Assuming it works as described, I do think it’s better than just holding full keys as they do now. And why else would they go to all this trouble? They can scan images on their servers right now if that’s what they want.
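To make the threshold idea concrete, here's a toy sketch. This is not Apple's actual protocol (which uses private set intersection and threshold secret sharing so the server cryptographically cannot decode anything below the threshold, and a perceptual hash rather than a cryptographic one); the function names and the threshold value are illustrative assumptions.

```python
import hashlib

THRESHOLD = 3  # hypothetical value of "x"; Apple has not published the real number

def toy_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash. A real perceptual hash
    # is robust to resizing/re-encoding; SHA-256 is used here only for the demo.
    return hashlib.sha256(image_bytes).hexdigest()

def scan(uploads, known_hashes, threshold=THRESHOLD):
    """Reveal matched items only once the match count exceeds the threshold.

    In the real system the "reveal" is cryptographic: below the threshold the
    server holds undecryptable vouchers; above it, the low-res visual
    derivatives become decryptable for human review.
    """
    matches = [img for img in uploads if toy_hash(img) in known_hashes]
    if len(matches) > threshold:
        return matches  # human reviewers would inspect the derivatives here
    return []           # at or below threshold: the server learns nothing
```

The key property being imitated: two or three matches yield nothing at all, so a single false positive never surfaces a user's photo to a reviewer.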


Low-res, I suppose, is better, but... if it's enough for a human to tell whether it's CSAM or not, it's probably high-res enough to be a significant invasion of privacy in case of a mistake.

Also the > x "hits" part is a good feature assuming that the database only looks for CSAM. Otherwise it's useless (not to mention totally unauditable).

My guess is that they're doing it on device because they've had several years of marketing and proclaiming that "everything is done on-device" so to implement CSAM scanning server side would go against that. Maybe they thought this would somehow look better to the average consumer who thinks "on-device" is automatically better?


Do you actually think they didn't have CSAM scanning implemented server-side before this?


> If that was their intention then I suspect they would have mentioned it alongside this announcement to provide a justification and quell the (justified) outrage.

It’s August. New iPhones and iOS / macOS are released in September. If they want to introduce E2E encryption for photos and need this in place to do it, then it makes sense to announce this ahead of time and get the backlash out of the way so that they can announce the headline feature during the main event without it being overshadowed.


Software features are announced at WWDC, which happens early summer and already happened this year. It's only the hardware that gets announced (and the software released) in September.


Some software features are announced at WWDC, mostly ones that affect developers. Consumer-facing services have also been announced during the September event; it’s not just hardware. Apple One, Apple TV+, and Apple Arcade were all announced during the last couple of September events.


> They want to move to e2e for photos so they don't have to keep those keys.

That's my suspicion too, but has it actually been confirmed?


Well why else would they bother? This is WAY more complicated than just scanning on their servers.


We have no information to make a decision there. It could be (as you say) because they want to implement E2E in the cloud, but it could just as well be because they want to start scanning offline content for folks who opted out of iCloud storage (and even if that is not their immediate intent, you can't argue that implementing this system doesn't take them 95% of the way to making it possible).


Probably because Apple wants to stay away from any suspicions that they sometimes actually use their keys to access private information.


> Thus far there has been no compelling answer as to why Apple needs to do this on device.

"You scratch my back and I scratch yours". Apple doesn't want to go through an antitrust lawsuit that will kill their money printer so they kowtow with these favors.


That's obviously completely untrue.

It doesn't matter anyway; end-to-end cryptography is meaningless if someone you don't trust owns one of the ends (and in this case, Apple owns both).



