
In the past they announced local scanning on the iPhone. They postponed it indefinitely after public feedback, and have now cancelled it completely.


The last paragraph of the story literally says that they're using it as an alternative to the iCloud scanning plan.


The last paragraph of the story is a concluding statement by the author on the difficulty of countering CSAM, and it says no such thing.

The announced opt-in feature for iCloud family accounts (Communication Safety in Messages) will scan content sent and received by the Messages app and alert the associated parent or caregiver directly, without informing Apple.


This was the last paragraph before WIRED edited the article to add commentary from RAINN as the last paragraph:

>"Technology that detects CSAM before it is sent from a child’s device can prevent that child from being a victim of sextortion or other sexual abuse, and can help identify children who are currently being exploited,” says Erin Earp, interim vice president of public policy at the anti-sexual violence organization RAINN. “Additionally, because the minor is typically sending newly or recently created images, it is unlikely that such images would be detected by other technology, such as Photo DNA. While the vast majority of online CSAM is created by someone in the victim’s circle of trust, which may not be captured by the type of scanning mentioned, combatting the online sexual abuse and exploitation of children requires technology companies to innovate and create new tools. Scanning for CSAM before the material is sent by a child’s device is one of these such tools and can help limit the scope of the problem.”

Those quotes are a continuation of the statement from "The Company", i.e. Apple.


Ah, I think you're confused by the way the preceding paragraph ends ("Apple told WIRED that it also plans to continue working with child safety experts [...]").

The paragraph you're quoting ("'Technology that detects...scope of the problem.'") is entirely commentary from Erin Earp at RAINN, and is what WIRED added in the edit.

And, sorry to nitpick, but "Countering CSAM is a complicated and nuanced endeavor [...]" has always been the last paragraph (both before and after the edit).



