I can’t think of a better way to argue in favor of “LLMs are copyright laundering machines” than from the humanness angle.
Humans have rights, software tools don’t.
If you grant an LLM the full set of human rights, then it can consume information, regurgitate copyrighted works, and use it to generate money for itself. However, considering blatantly obvious theft as “homage” goes hand in hand with free will, agency, being in control of yourself, not being enslaved and abused, etc. Pondering various scenarios along those lines really gets to the heart of why an LLM is so very much not a human, and how subjecting it to the same treatment as humans is a ridiculous notion.
If you don’t grant an LLM human rights, then ClosedAI’s stance is basically that pirating works is OK because they pass them through a black box of if conditions and it leads to results that they can monetize. That’s such a solid argument, it’ll surely play well in a court of law.
Training on data is not something the “LLM does”: first, because the “it” here is not learning or understanding in the human sense (otherwise you would have to presume that an LLM is a human), and second, because a software tool has no agency — it’s really just Microsoft using a tool built on copyrighted works to generate profit.
Humans don't exactly have the greatest track record of granting other humans rights. I don't presume they'll get it any better with AI.
What I expect to happen is whoever has the most influence and power will get what they want and we'll end up raising a generation with the implicit understanding of "that's just how things are," natural order, truth, reality, and all that jazz.
The only thing that ever changes outcomes is when the contradictions of the status quo become impossible to manage.
I can’t argue for or against whether LLMs should have rights or not… I can only point out the hypocrisy of claiming LLMs are “human-like” and independent enough that their operators-turned-slaveowners cannot be held to account on any copyright matters, while also claiming that LLMs are not human-like at all, lest someone demand rights for them and nuke the industry.