
I'm honestly shocked that Apple is buying into this because it's one of those well-intentioned ideas that is just incredibly bad. It also goes to show you can justify pretty much anything by saying it fights terrorism or child exploitation.

We went through this 20+ years ago, when US companies couldn't export "strong" encryption (meaning anything stronger than 40 bits, if you can believe that). Even at the time that limit was ridiculously low.

We then moved on to cryptographic backdoors, which seem like a good idea but aren't, for the obvious reason that if a backdoor exists it will be exploited by someone you didn't intend, or used by an authorized party in an unintended way (parallel construction, anyone?).

So these photos exist on Apple servers but what they're proposing, if I understand it correctly, is that that data will no longer be protected on their servers. That is, human review will be required in some cases. By definition that means the data can be decrypted. Of course it'll be by (or intended to be by) authorized individuals using a secured, audited system.

But a backdoor now exists.

Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either of course.

Here's another issue: in some jurisdictions it's technically a case of distributing CSAM to have a naked photo of yourself (if you're underage) on your own phone. It's just another overly broad, badly written statute thrown together in the hysteria of "won't anybody think of the children?" but it's still a problem.

Will Apple's system identify such photos and lead to people getting prosecuted for their own photos?

What's next after this? Uploading your browsing history to see if you visit any known CSAM trafficking sites or view any such material?

This needs to be killed.



I don't know how I feel about all of this yet (still trying to understand better), but your post implies that you've made a lot of incorrect assumptions about how this system works.

For example, the main system in discussion never sends the image to Apple, only a "visual proxy", and furthermore, it only aims to identify known (previously cataloged) CSAM.

There's a [good primer of this on Daring Fireball](https://daringfireball.net/2021/08/apple_child_safety_initia...)


If the visual proxy is enough to distinguish CSAM from non-CSAM, it's a significant invasion of privacy. Sure, a thumbnail is less information than the full-resolution image, but not that much less.


FWIW I'm not defending this, but it's important to get the facts correct.

1) Someone can't just randomly review one of your images. The implementation is built on threshold secret sharing, so the visual derivative stays cryptographically sealed and can't be reviewed unless you hit the threshold of matched content (see the sketch after this list).

2) You're uploading these files to iCloud, which is currently not end-to-end encrypted. So these photos can be reviewed in the current iCloud regime.
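To make the threshold point in (1) concrete, here is a rough sketch of plain Shamir threshold secret sharing in Python. This is only an illustration of the general technique, not Apple's actual protocol (which layers the shares inside private set intersection and per-image "safety vouchers"), and the threshold of 30 below is a made-up number.

```python
import random

PRIME = 2**127 - 1  # a large prime field for this toy example

def make_shares(secret, threshold, count):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0; only correct with >= threshold shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

# One share is released per matched image; the account-level key only
# becomes recoverable once the server holds at least `threshold` shares.
key = random.randrange(PRIME)
shares = make_shares(key, threshold=30, count=100)
assert recover(shares[:30]) == key   # at the threshold: key recovered
assert recover(shares[:29]) != key   # below it: recovery (almost surely) fails
```

The property that matters is in the last two lines: below the threshold, the shares reveal essentially nothing, so no individual visual derivative can be opened.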


Yeah, I'm aware of this.

1) Still, I'm unable to audit this protocol, which has a threshold I'm not allowed to know. It also always comes back to who controls the "hash" DB: if anything can be added to it (as Apple could), the threshold protection becomes much less meaningful.

2) My understanding was that they currently don't do that, but perhaps I'm incorrect. I do know for a fact that they give access to law enforcement if there's a subpoena, however. Also, building scanning functionality into the device is a different matter: when scanning is done on their server, they can only ever access what I have already sent; once it's local, the line is much fuzzier (even if the feature promises to scan only iCloud photos).


Legally, a visual proxy of CP is CP


And?

My point about visual proxies was in reference to the OP's point:

> Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either of course.

I never said that a visual proxy/derivative wasn't CSAM.

I assume your point had something to do with the legality of sending this data to Apple for review?

I'm not a lawyer, and I have read that NCMEC is the only entity with a legal carve-out for possessing CSAM, but if FB and Google already have teams of reviewers for this type of material and other abuse imagery, I imagine there must be a legal way for this kind of review to take place. I mean, these were all images that were being uploaded to iCloud anyway.


>So these photos exist on Apple servers but what they're proposing, if I understand it correctly, is that that data will no longer be protected on their servers. That is, human review will be required in some cases. By definition that means the data can be decrypted. Of course it'll be by (or intended to be by) authorized individuals using a secured, audited system.

Apple has always had the decryption keys for encrypted photos stored in iCloud, so this isn't new. They never claimed that your photos were end-to-end encrypted. I'm not sure how this is a "backdoor" unless you think there's a risk of either something like AES getting broken or Apple storing the keys in a way that's insecure, both of which seem unlikely to me.
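For illustration, here is a tiny sketch of the "provider holds the key" model described above. It uses Python's `cryptography` package and invented names; it is obviously not Apple's actual key-management code.

```python
# Provider-held-key encryption: data is encrypted at rest, but the provider
# keeps the key, so it can still decrypt on demand (for review, or under a
# subpoena). End-to-end encryption would keep the key only on user devices.
from cryptography.fernet import Fernet

provider_key = Fernet.generate_key()      # lives in the provider's key service
box = Fernet(provider_key)

uploaded_photo = b"raw jpeg bytes from the user's library"
ciphertext_at_rest = box.encrypt(uploaded_photo)

# No "backdoor" in the cryptographic sense is needed: the provider simply
# has the key, so decryption is a normal, authorized operation.
assert box.decrypt(ciphertext_at_rest) == uploaded_photo
```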

>Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either of course.

I'm equally interested in the review process. But while perceptual hash collisions are possible, it seems unlikely that multiple unrelated nude photos on the same device would all almost exactly match known CSAM content, which is what has to happen before Apple reviews anything.
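As a toy illustration of why (this is a simple Hamming-distance matcher, not NeuralHash, and every number in it is invented):

```python
# Toy model of matching device photos against a database of known perceptual
# hashes (64-bit ints). A single near-collision doesn't trigger anything;
# only crossing a multi-match threshold does.
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def count_matches(device_hashes, known_hashes, max_distance=4):
    """Count device photos whose hash is near any known hash."""
    return sum(
        1 for h in device_hashes
        if any(hamming(h, k) <= max_distance for k in known_hashes)
    )

THRESHOLD = 30  # invented number; Apple hasn't said what the real one is

known_hashes = {0x9F3A5C7E12B4D688, 0x51C2AA07E9B33D10}   # made-up DB entries
device_hashes = [0x9F3A5C7E12B4D689, 0x7777000011112222]  # one near-match, one not

matches = count_matches(device_hashes, known_hashes)
needs_human_review = matches >= THRESHOLD   # False: one collision isn't enough
```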


The OP's article cites the number of NCMEC reports from Apple vs. other tech giants (200-something vs. 20-million-something at Facebook). It is all a bit confusing, and I expect most of us are learning more about iCloud than we ever planned to; Apple has been able to decrypt our iCloud photos all along, but those reporting figures make it pretty clear that they haven't been doing so en masse. This is a big shift.


> Will Apple's system identify such photos and lead to people getting prosecuted for their own photos?

No, no, no. As has been said a billion times by now, this system matches copies of specific CSAM photographs in the NCMEC’s database.



