
Yeah, as usual I'm worried that the people who claim others don't read are the ones not reading (or not comprehending) what the author is trying to say. To me it seems like moderation in general is fine. What Apple is doing here is that once a flag tells them a certain threshold has been crossed, they manually review the material. The author's argument is that nobody is allowed to do that, i.e., the law explicitly prohibits anyone from even trying to verify. If you suspect CP, you have to forward it to NCMEC and be done with it.

I 100% understand why Apple doesn't want to do that - automatically forward everything - they're clearly worried about false positives. I also think Apple has competent lawyers, so it's entirely possible that the interpretation of the author and their lawyers is simply wrong.

Point is - the author isn't trying to say moderation is illegal.



The whole thing rests on whether Apple knows that the content is CSAM or not. And they don’t. The author gets this fundamentally wrong: Apple does not know whether an image is a match at the time the voucher is created. The cryptographic process knows, but Apple doesn’t. They only learn something when the system detects a threshold number of matches in the account, and only then can they decrypt and verify the matches.
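
To make the “the process knows, but Apple doesn’t” point concrete: Apple’s technical summary describes a threshold secret sharing scheme, where each matching voucher carries one share of an account-level key and the server can’t decrypt anything until it holds enough shares. Here’s a minimal sketch of that idea (my own illustrative code and parameters, not Apple’s; the real system also layers PSI and NeuralHash on top):

    # Sketch of threshold secret sharing (Shamir). Each *matching* voucher
    # contributes one share of an account-level key; below THRESHOLD shares,
    # reconstructing the key is information-theoretically impossible.
    # PRIME and THRESHOLD are illustrative assumptions, not Apple's values.
    import random

    PRIME = 2**127 - 1   # prime field for Shamir arithmetic
    THRESHOLD = 3        # illustrative; Apple reportedly uses ~30

    def make_shares(secret, threshold, n):
        """Split `secret` into n shares; any `threshold` of them recover it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def poly(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x=0 to recover the secret."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    # Server side: one share arrives per matching voucher.
    account_key = random.randrange(PRIME)
    shares = make_shares(account_key, THRESHOLD, n=10)
    print(reconstruct(shares[:THRESHOLD]) == account_key)      # True: enough matches
    print(reconstruct(shares[:THRESHOLD - 1]) == account_key)  # False (w.h.p.): too few

Below the threshold, the shares reveal nothing about the key, which is the technical basis for saying Apple doesn’t “know” about any individual match.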

Additionally, we already know they consulted with NCMEC on this because of the material that leaked the other day: internal memos from Apple leadership and a letter NCMEC sent congratulating them on the new system. If you think they haven’t evaluated the legality of what they’re doing, you’re just wrong.



