This is the big thing that needs to be addressed. These models are nothing without that data. Code, art, music, just plain old conversations freely given for the benefit or entertainment of other humans.
Humans are accountable for what they use or "borrow."
These models, meanwhile, seem to be getting a free pass through the legal system.
Another way of looking at this: humans need some type of credential for many professions, and they have to pay to be taught and certified, out of their own pocket and on their own time.
Not only will these models copycat your work, but a lot of companies and industries seem very keen on ignoring the fact that these models have never had to pass any sort of exam.
The software has more rights and privileges than actual humans at this point.
> The software has more rights and privileges than actual humans at this point.
That's been true for some time though, right? For example, if someone pins illegal content to a community notice board in front of your store, you're held to a different legal standard than a social media platform is when someone posts the same content there.
I don’t think that’s right either, but this kind of “tech exceptionalism” has been baked into law for decades. AI is inheriting those privileges more than inventing new ones.