Now the argument for backdoors coming from civil society is based on CSAM:
> Heat Initiative is led by Sarah Gardner, former vice president of external affairs for the nonprofit Thorn, which works to use new technologies to combat child exploitation online and sex trafficking. In 2021, Thorn lauded Apple's plan to develop an iCloud CSAM scanning feature. Gardner said in an email to CEO Tim Cook on Wednesday, August 30, which Apple also shared with WIRED, that Heat Initiative found Apple's decision to kill the feature “disappointing.”
> “Apple is one of the most successful companies in the world with an army of world-class engineers,” Gardner wrote in a statement to WIRED. “It is their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and videos. For as long as people can still share and store a known image of a child being raped in iCloud we will demand that they do better.”
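For context, the "detection of known child sexual abuse images" Gardner refers to is generally hash matching: files are fingerprinted and compared against a database of hashes of previously identified material. Apple's 2021 proposal used a perceptual hash (NeuralHash) matched on-device against NCMEC-supplied hash lists before upload, which is what drew the "backdoor" criticism in the first place. Below is a minimal sketch of the matching idea only, using a plain cryptographic hash purely to stay self-contained; real systems use perceptual hashes so that re-encoded or resized copies still match, and the database contents here are hypothetical.

```python
# Minimal sketch of "known-image" detection via hash matching.
# Real deployments (PhotoDNA, Apple's proposed NeuralHash) use perceptual
# hashes that tolerate re-encoding; SHA-256 is used here only for illustration.

import hashlib
from pathlib import Path

# Hypothetical database of hashes of previously identified images.
# In real systems this list is supplied by a clearinghouse such as NCMEC.
KNOWN_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known(path: Path) -> bool:
    """True if the file's hash appears in the known-image database."""
    return file_digest(path) in KNOWN_HASHES
```

The technical lookup itself is trivial; the dispute in this thread is about where that matching runs (on your device versus the server) and who controls the hash list.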
This isn't even a recent argument. "iPhone will become the phone of choice for the pedophile" was said by a senior law enforcement official back in 2014, when full-device encryption was starting to become common.
It's the perfect political weapon: anyone who opposes it is automatically labeled a pedophile or child abuser, their reputation is destroyed, and they never oppose it again.
CSAM is evil, and I personally believe we should execute those who distribute it.
I have an even stronger belief in the right to privacy, and those in the government who want to break it should be executed from their positions (fired and publicly shamed).
Yeah, IIRC, there is precedent for purely fictional material being prosecuted. Reprehensible as it is, it worries me deeply that consuming fiction can cross a line into illegality. Pedophilia is so rightfully hated that it's a powerful motivator in politics and social action; people will sometimes throw away their lives just to spite child abusers. I think we need to be extra careful about our response to the issue because of that, especially as it pertains to essential rights.
And that's why I'm cautious here. Actual CSAM (and pedophilia) should be punished as harshly as possible, period. Once we're out of that realm, intent begins to be relevant. Perhaps harsher punishments for pedophiles (chemical castration is an artful solution) would help quell the issues that CSAM "art" can cause.
Of course, make it quick and painless, but that behavior simply cannot be tolerated in a civilized society. Children are incredibly important, and how we treat them as a society is critical. The threats to children in modern times, both perceived and real, have reduced the freedoms afforded to them, hampering their ability to develop in the real world from a young age.
Source: https://www.wired.com/story/apple-csam-scanning-heat-initiat...