
Hm, what if you modified a CSAM video to evade detection (adversarial ML, etc.), then injected it onto the target's machine with this fun bug? When the target's computer automatically creates a thumbnail for the video, the thumbnail would be automatically flagged.


Or just make your own fake/questionable hash collisions with a script some guy made on GitHub: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
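To illustrate the idea behind such collisions: perceptual hashes binarize the output of a model, so if you can compute gradients through (a surrogate of) that model, you can nudge an input until its hash matches an arbitrary target. Here's a toy sketch using a random linear map as a stand-in hash — this is NOT NeuralHash or the linked script, just the general gradient-descent collision technique on a made-up model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a perceptual-hash network: a fixed random linear map
# followed by sign binarization. (The real NeuralHash is a CNN; this is
# purely illustrative.)
D, BITS = 64, 16                     # "image" size, hash length
W = rng.normal(size=(BITS, D))

def perceptual_hash(x):
    return (W @ x > 0).astype(int)   # one bit per row of W

def hamming(a, b):
    return int(np.sum(a != b))

# A source "image" and the target hash we want it to collide with.
x = rng.normal(size=D)
target = perceptual_hash(rng.normal(size=D))

# Gradient descent on a smooth surrogate: hinge loss pushing each
# pre-binarization logit onto the side demanded by the target bit.
signs = 2 * target - 1               # {0,1} -> {-1,+1}
lr = 0.05
x_adv = x.copy()
best_x, best_h = x.copy(), hamming(perceptual_hash(x), target)
for _ in range(500):
    logits = W @ x_adv
    active = (1.0 - signs * logits) > 0          # bits still violating margin
    grad = -(W.T @ (signs * active))             # d(hinge loss)/d(x_adv)
    x_adv -= lr * grad
    h = hamming(perceptual_hash(x_adv), target)
    if h < best_h:
        best_x, best_h = x_adv.copy(), h

before = hamming(perceptual_hash(x), target)
after = best_h
print(before, "->", after)           # Hamming distance to target shrinks
```

Against the real model the same loop runs through the ONNX graph instead of `W @ x`, usually with an extra term keeping the perturbed image visually close to the original.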



