I don't know how I feel about all of this yet (still trying to understand it better), but your post implies that you've made a lot of incorrect assumptions about how this system works.

For example, the main system in discussion never sends the image to Apple, only a "visual proxy", and furthermore, it only aims to identify known (previously cataloged) CSAM.

There's a [good primer of this on Daring Fireball](https://daringfireball.net/2021/08/apple_child_safety_initia...)
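
To make the "known catalog" point concrete, here's a minimal sketch of matching an image hash against a fixed database. The hash function and database entries below are placeholders I made up; Apple's actual system uses a proprietary perceptual hash (NeuralHash) and a blinded, encrypted database, which this does not model:

```python
import hashlib

# Illustrative only: shows the "match against a known catalog" shape.
# A cryptographic hash stands in for a real perceptual hash so the
# example runs; real perceptual hashes tolerate small image edits.

def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of previously catalogued images (hypothetical entries).
KNOWN_HASHES = {
    image_hash(b"example-known-image-1"),
    image_hash(b"example-known-image-2"),
}

def matches_known_catalog(image_bytes: bytes) -> bool:
    # Only membership in the known set is checked; a novel image,
    # however explicit, produces no match.
    return image_hash(image_bytes) in KNOWN_HASHES

print(matches_known_catalog(b"example-known-image-1"))  # True
print(matches_known_catalog(b"a brand new photo"))      # False
```

The point is just that matching is against a fixed set of previously catalogued images, not a classifier judging new photos.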



If the visual proxy is enough to distinguish CSAM from non-CSAM, it's a significant invasion of privacy. Sure, a thumbnail is less information than full-res, but not that much less.


FWIW I'm not defending this, but it's important to get the facts correct.

1) Someone can't just randomly review one of your images. The implementation is built on threshold secret sharing, so the visual derivative can't be reviewed (it stays cryptographically protected) unless you hit the threshold of matched content (see the sketch after point 2).

2) You're uploading these files to iCloud, which is currently not end-to-end encrypted. So these photos can be reviewed in the current iCloud regime.
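
On point 1, here's a minimal sketch of the threshold idea, using plain Shamir secret sharing as a stand-in. This is not Apple's actual protocol, and the real threshold value isn't public; the numbers and names below are made up for illustration:

```python
import random

# Sketch only: a key is split so that fewer than `threshold` shares
# reveal nothing, mirroring the claim that visual derivatives stay
# unreadable until enough matches accumulate. Requires Python 3.8+.

PRIME = 2**127 - 1  # field modulus for the arithmetic below

def make_shares(secret: int, threshold: int, total: int):
    # Random polynomial of degree threshold-1 with the secret as the
    # constant term; each share is one point on that polynomial.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, total + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term
    # (the secret) -- but only with at least `threshold` shares.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for the decryption key
shares = make_shares(key, threshold=30, total=100)  # hypothetical numbers
print(reconstruct(shares[:30]) == key)  # True: threshold met
print(reconstruct(shares[:29]) == key)  # False (w.h.p.): below threshold
```

Below the threshold, interpolation yields an unrelated value, which is the property the design leans on to keep individual visual derivatives unreadable.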


Yeah, I'm aware of this.

1) Still, I'm unable to audit this protocol, which has a threshold I'm not allowed to know. It also always comes back to control over the "hash" DB: if you can add anything to it (as Apple could), the threshold becomes much less of a safeguard.

2) My understanding was that they currently don't, but perhaps I'm incorrect. I do know for a fact that they give law enforcement access if there's a subpoena, however. Also, there is a difference with building local scanning functionality into the device: when scanning is done on their server, they can only ever access what I have sent; otherwise, the line is much fuzzier (even if the feature promises to scan only iCloud photos).


Legally, a visual proxy of CP is CP


And?

My point about visual proxies was in reference to the OP's point:

> Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either of course.

I never said that a visual proxy/derivative wasn't CSAM.

I assume your point had something to do with the legality of sending this data to Apple for review?

I'm not a lawyer, and I have read that NCMEC is the only entity with a legal carve-out for possessing CSAM, but if FB and Google already have teams of reviewers for this type of material and other abuse imagery, I imagine there must be a legal way for this kind of review to take place. I mean, these were all images that were being uploaded to iCloud anyway.



