>This is an attempt to solve the right problem (CSA)

Is it though? Does punishing people for possessing CSAM actually have any effect on the rate of CSA?

I'm not saying criminalizing CSAM isn't the right choice, but is there any data at all to suggest that punishing the kinds of people who would be caught if we scanned their cloud storage for CSAM actually has a real effect on the number of children harmed?

Because if it doesn't, then it stands to reason that the only thing this would accomplish is the feel-good act of putting pedophiles behind bars. But I'm not really worried about pedophiles; I'm worried about child rapists and child pornographers who actually commit crimes against children.

I don't see how going after end users is going to do anything at all about the supply chain. But I see a million ways that being allowed to go after end users would give the state broad power to search people for just about any crime it decided was important enough to warrant this kind of invasion of privacy. If they are allowed to do this, there is a 100% chance they'll expand beyond CSAM within a few years.



> Does punishing people for possessing CSAM actually have any effect on the rate of CSA?

I think this is a more fundamental question than it seems. In a few years (probably less time than it takes to actually implement these laws), almost all CSAM could be AI-generated. At that point, consuming it would be effectively a victimless crime: punishing someone for his sexual inclinations in the absence of any actual victim.

Unless, of course, it's been proven that consumption of CSAM substantially increases the risk of CSA behaviour in the consumers.



