
Also, there's a "we can't show you the exact CP image that one of yours matched, because that would also be illegal" catch-22 that basically gives them the power to shroud this whole system in legally obligated secrecy - secrecy which will no doubt be abused to hide much more.



I do worry this is basically automating and scaling up the whole "idiot working at the mall photo development booth turns a family in for child porn because they took pictures of their 3-year-old in the bath" story that shows up every now and then.


I spent about a year working in a retail photo processing lab - I can assure you that what you describe is exceedingly rare.

I saw “questionable” images on probably a weekly basis. Maybe once a month something would come through that warranted bringing my supervisor over for a second set of eyes - something I didn’t think justified calling the police, but wasn’t entirely sure about. One example that comes to mind was a series of photos that appeared to show a young woman bound and gagged to a pipe in a basement - but she was obviously of age, her eyes in the photos didn’t look fearful, and the rest of the roll showed her free and apparently happy. I was also the person who had accepted the roll of film, so I knew she was the one who dropped it off. We didn’t call anyone on that one, but I figured it wouldn’t hurt to get a second opinion. When she came to pick the photos up, I did ask her to review them and make sure everything was OK.

Twice in about a year we had photos come through that were obviously what’s now called “CSAM”. In both cases I called the police without consulting management, then told them afterward. Also in both cases, the negatives were put into the store’s safe and the photos’ owners were arrested on-site when they came to pick them up.

All of this is to say - photo lab techs see some shit. If they called the police every time there was a picture of a kid in the bath, police brutality would be at an all-time low because they wouldn’t have time to respond to anything else. :)


Reading through Daring Fireball's technical discussion of it (https://daringfireball.net/2021/08/apple_child_safety_initia...) does much to allay my worries, though my natural distrust of authority and the powerful still makes me dislike anything like this - no matter the spin.


It's exactly that, though - except instead of the "idiot" it's a "completely dumb neural network".
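
To unpack what that network is actually doing, at least as described in the Daring Fireball write-up: the device computes a perceptual hash of each photo, compares it against a blocklist of hashes of known images, and nothing is surfaced for human review until the number of matches crosses a threshold. A rough sketch of that shape - with a plain cryptographic hash and a made-up threshold standing in for the real perceptual hash and Apple's actual parameters, and function names that are mine, not Apple's:

    import hashlib
    from typing import Iterable, Set

    MATCH_THRESHOLD = 30  # illustrative; nothing gets flagged below this count

    def image_hash(image_bytes: bytes) -> str:
        # Stand-in hash. A real perceptual hash maps visually similar images
        # to the same value, which is also where false positives come from.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(images: Iterable[bytes], known_hashes: Set[str]) -> int:
        return sum(1 for img in images if image_hash(img) in known_hashes)

    def should_flag(images: Iterable[bytes], known_hashes: Set[str]) -> bool:
        # Only when the match count crosses the threshold is anything
        # flagged for human review.
        return count_matches(images, known_hashes) >= MATCH_THRESHOLD

The "judgment" here is just a hash comparison plus a counter; no human looks at anything until the threshold trips, and the comparison happens against hashes the device owner can't inspect.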



