This comment is great, but I feel we're also leaving out the "human element" that is necessarily part of these sorts of questions. Ultimately, we, as a people, don't want pedophiles preying on children, we don't want children abused, etc. These, I feel, are not controversial statements. However, allowing AI-generated CSAM as a way to placate pedophiles does a few things:
Firstly, it normalizes this material, even to the tiniest degree you'd like to claim. If you bring something that was formerly illegal into the state of being legal, it becomes a part of our society. Weed is a fantastic example. It's well on track to becoming a very casual drug in our society, and with each step forward along that path, it becomes less remarkable. When I was in high school, I was taught, by teachers and hired professionals, about the "dangers" of weed and other drugs. Now, I drive down the street and pass a couple of dispensaries selling that product and many derivatives of it, completely without drama. It is simply a thing that exists.
[And to not leave it merely implied, that's a GOOD thing.]
So, that being the case, are we as a society prepared to live in one where something like AI-generated CSAM is, to whatever degree, an accepted thing to have, sell, and create? Are we okay with that if it satiates pedophiles? Are we prepared to reckon with the consequences if it doesn't, and more children are harmed?
Secondly, I think we have to contend with the fact that now that this technology exists, it will continue to exist regardless of legality. This is one of the reasons I was so incredibly opposed to widespread and open-source AI in the first place, and at the risk of sounding like "I told you so," one of the concerns I outlined many times is that this technology enables people to create... anything, at a near-industrial scale, be that disinformation, be that spam, be that non-consensual pornography, be that CSAM. I don't think a black-box program that can run on damn near any consumer PC and create photorealistic renderings of anything you can describe in text is inherently, by virtue of its being, a bad thing, but I do think it's something that we as a society are not ready for, and I was not alone in that thinking. But now it's here, and now we have to deal with it.
I'd have to look up the exact data, but the Netherlands has a long history of decriminalizing weed, and that decriminalization did not increase usage among the Dutch population.
I don't think you'd find that statement to be nearly as unanimously agreed with as mine. I personally love violent media, both the action-oriented John Wick type stuff and shooter games, and I'm also an avid fan of gruesome horror.