the imessage scanning just provides on-device warnings and blurring of photos suspected of being sexually explicit. nothing is transmitted off the phone. parents only get a notification, and only if their child is under 13, and only then if the child sends or views a nude photo after being warned by the system.
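to spell out the flow as i understand it (a toy sketch only, not apple's code -- the function names, the `Child` type, and the classifier stub are all made up for illustration):

```python
# illustrative sketch of the described on-device flow; not Apple's actual
# implementation. all names here are invented for the example.

from dataclasses import dataclass

@dataclass
class Child:
    age: int
    warned: bool = False  # has the system already warned this child?

def classify_image(image: bytes) -> bool:
    """stand-in for the on-device ML classifier (assumption: returns True
    when an image is flagged as sexually explicit)."""
    return b"explicit" in image  # toy heuristic, for illustration only

def handle_outgoing_photo(child: Child, image: bytes) -> str:
    """everything runs on-device; nothing is transmitted off the phone."""
    if not classify_image(image):
        return "send"                    # not flagged: send normally
    if not child.warned:
        child.warned = True
        return "warn_and_blur"           # first pass: blur photo, warn child
    # the child chose to proceed after being warned
    if child.age < 13:
        return "send_and_notify_parent"  # parent alert applies to under-13 only
    return "send"                        # 13-17: warned, but no parent alert

# usage
kid = Child(age=12)
print(handle_outgoing_photo(kid, b"explicit photo"))  # -> warn_and_blur
print(handle_outgoing_photo(kid, b"explicit photo"))  # -> send_and_notify_parent
```

note the two gates: the warning always comes first, and the parent notification only fires for under-13 accounts that proceed anyway. a 13-to-17-year-old gets the warning but no parent alert.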
> Apple’s second main new feature is two kinds of notifications based on scanning photos sent or received by iMessage.
> To implement these notifications, Apple will be rolling out an on-device machine learning classifier designed to detect “sexually explicit images”.
> According to Apple, these features will be limited (at launch) to U.S. users under 18 who have been enrolled in a Family Account.
> In these new processes, if an account held by a child under 13 wishes to send an image that the on-device machine learning classifier determines is a sexually explicit image, a notification will pop up, telling the under-13 child that their parent will be notified of this content.
> If the under-13 child still chooses to send the content, they have to accept that the “parent” will be notified, and the image will be irrevocably saved to the parental controls section of their phone for the parent to view later.
yeah, not sure where they're getting that from. neither apple's child safety landing page nor the pdf info sheet they released mentions anything about photos being saved. the eff claim the image will be saved on the child's device for the parent to view later, but i can't find a source for that.
from the eff page:
>…if the under-13 user accepts the image, the parent is notified and the image is saved to the phone. … once sent or received, the “sexually explicit image” cannot be deleted from the under-13 user’s device.