
I'm OK with software generating signatures from cloud drive images to eliminate child porn pictures and catch pedophiles.



Have you ever had the impression that special interests are much more interested in copyright violations than in child pornography? Or that it might extend to memes that sabotage carefully crafted propaganda? I get that impression a lot.

I don't wade into the lowest circles of internet hell, but I'm not especially careful about where I go either. In 10 years of almost pathological internet usage, I have never once encountered child pornography.


Scanning for DRM violations was one of the use cases mentioned in the article linked above.


I didn't know what CSAM stood for, so I first thought this would be the reaction to the security issues iPhones faced. Oh, silly me.

Additionally, by calculating hashes of media on people's devices, you can quickly map out networks. A private image you shared with your friends? It has a unique hash, and everyone who holds a copy is probably in your network. That is aside from the issue that they could also just read your contacts.
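
To make that concrete, here is a minimal sketch of the grouping logic (the names and data are hypothetical, and this is an assumption about how stored hashes could be used, not a description of any real provider's system):

    import hashlib
    from collections import defaultdict

    def file_hash(data: bytes) -> str:
        # Exact-match hash; a perceptual hash would link near-copies too.
        return hashlib.sha256(data).hexdigest()

    # (user, file bytes) pairs, as a provider scanning uploads would see them
    uploads = [
        ("alice", b"private vacation photo"),
        ("bob",   b"private vacation photo"),  # same image, shared privately
        ("carol", b"unrelated meme"),
    ]

    holders = defaultdict(set)
    for user, data in uploads:
        holders[file_hash(data)].add(user)

    # Any hash held by more than one account exposes a sharing link.
    for h, users in holders.items():
        if len(users) > 1:
            print(h[:12], "shared by:", sorted(users))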


You should read the rest of the linked Twitter thread, because the issue is that if the hash algorithm has a collision vulnerability, any image could be manipulated to show up as "child porn" to the scanner.
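
For a sense of why that is plausible, here is a toy preimage construction against aHash (average hash), one of the simplest perceptual hashes. PhotoDNA and NeuralHash are far more sophisticated, but they share the property that the hash keeps only a coarse summary of the image. This is an illustrative sketch assuming Pillow is installed and "flagged.jpg" is a hypothetical file, not an attack on any real scanner:

    from PIL import Image

    def average_hash(img, size=8):
        # 64-bit aHash: threshold a size x size grayscale thumbnail at its mean.
        small = img.convert("L").resize((size, size))
        pixels = list(small.getdata())
        mean = sum(pixels) / len(pixels)
        return int("".join("1" if p > mean else "0" for p in pixels), 2)

    def forge_preimage(target_hash, size=8, scale=32):
        # Paint each cell black or white according to the target's bits. The
        # result reproduces target_hash while looking nothing like the source.
        img = Image.new("L", (size * scale, size * scale))
        for i in range(size * size):
            bit = (target_hash >> (size * size - 1 - i)) & 1
            x, y = (i % size) * scale, (i // size) * scale
            img.paste(255 if bit else 0, (x, y, x + scale, y + scale))
        return img

    # forged = forge_preimage(average_hash(Image.open("flagged.jpg")))
    # average_hash(forged) then matches the flagged image's hash exactly.

The uniform-block trick only works this directly on aHash, but the publicly demonstrated NeuralHash collisions show the same class of failure in a deployed system.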


Microsoft created and hosts the PhotoDNA service, which most major providers use, and PhotoDNA has false positives. All reports are supposed to be manually reviewed before being sent to NCMEC's CyberTipline.
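
Some rough arithmetic shows why manual review matters at cloud scale; both numbers below are illustrative assumptions, not published PhotoDNA figures:

    # Expected false reports per day = photos scanned x per-photo FP rate.
    daily_uploads = 2_000_000_000   # photos scanned per day (assumed)
    fp_rate = 1e-9                  # per-photo false-positive rate (assumed)
    print(daily_uploads * fp_rate)  # -> 2.0 innocent images flagged per day
    # At fp_rate = 1e-6, the same volume yields ~2,000 false flags every day.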


But are they? The Swiss federal police weren't too happy about the reports they received via the CyberTipline.



