
These are perceptual hashes though.

What if you take a picture of your naked baby in the bathtub, or of your child on the beach, and it happens to be perceptually close to a known CSAM image whose hash is in the database, close enough to fall within the distance threshold they're using?

The picture would be sent for screening, the Apple screener would indeed say that's a naked baby/child, and soon enough you've got the FBI (or whatever the equivalent is in your country) knocking on your door and arresting you for pedophilia.
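To make the concern concrete, here is a minimal sketch of how generic perceptual-hash matching with a distance threshold works, using the open-source imagehash library rather than Apple's NeuralHash (which is not public). The file name, the stored hash value, and the threshold are made up for illustration; the point is only that "match" means "Hamming distance below some cutoff", not "bit-for-bit identical".

```python
# Sketch of generic perceptual-hash matching (NOT Apple's NeuralHash).
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# Hypothetical threshold: max Hamming distance (in bits) still treated as a match.
MATCH_THRESHOLD = 10

# Hash of a hypothetical family photo.
family_photo = imagehash.phash(Image.open("bathtub_photo.jpg"))

# Stand-in for a hash stored in the database (64-bit hex string).
known_hash = imagehash.hex_to_hash("d1d1d1d1e0e0e0e0")

# ImageHash overloads subtraction to return the Hamming distance in bits.
distance = family_photo - known_hash

if distance <= MATCH_THRESHOLD:
    # In the scheme described above, this is where the image would be
    # flagged and escalated to human review.
    print(f"Flagged: distance {distance} is within threshold {MATCH_THRESHOLD}")
else:
    print(f"No match: distance {distance}")
```

Because visually distinct images can still land within a small Hamming distance of each other, a loose threshold trades fewer missed matches for more false positives of exactly this kind.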
