
That's an interesting point. However, I'm not sure victim privacy is the reason for CSAM regulations; rather, it's reducing the creation of CSAM by discouraging its exchange. For example, suppose that instead of deleting/reporting the images, Apple detected them and altered them with deepfake technology so the victim is no longer identifiable. That would protect the victim's privacy, but it wouldn't reduce creation or exchange. The fact that such a proposal is ridiculous suggests that privacy isn't the reason for the regulation; reducing creation and exchange is.



There is an utterly perverse incentive to consider as well.

If the median shelf-life of abuse evidence is shortened, in that a given item can no longer be forwarded/viewed/stored/..., what does that imply in a world where demand remains relatively stable?

I despise the abusers for what they do, and the ecosystem they enable. But I also remember first having this argument more than ten years ago. If you, as a member of law enforcement or a child-wellbeing charity, only flag the awful content and do nothing else about it, you are, in my mind, guilty of criminal negligence. Adding an entry to a database amounts to nothing more than saying, "at least nobody else will see that in the future." That does NOTHING to prevent the creation of more such material, and thus implicitly endorses the ongoing abuse and crimes against children.

Every one of these images and videos is a piece of evidence. Of a horrifying crime committed against a child or children.




