My point about visual proxies was in reference to the OP's point:
> Also, what controls exist on those who have to review the material? What if it's a nude photo of an adult celebrity? How confident are we that someone can't take a snap of that on their own phone and sell it or distribute it online? It doesn't have to be a celebrity either of course.
I never said that a visual proxy/derivative wasn't CSAM.
I assume your point had something to do with the legality of sending this data to Apple for review?
I'm not a lawyer, and I've read that NCMEC is the only entity with a legal carve-out for possessing CSAM, but if FB and Google already have teams of reviewers for this type of material and other abuse imagery, I imagine there must be a legal way for this kind of review to take place. These were all images that were being uploaded to iCloud anyway.