
I keep seeing "AI ethics" being redefined to focus on fictional problems instead of real-world ones, so I wrote a little post on it.


Great little post. Congrats.

Also there's the ethics of scraping the whole internet and claiming that it's all fair use, because the other scenario is a little too inconvenient for all the companies involved.

P.S.: I expect a small thread telling me that it's indeed fair use, because models "learn and understand just like humans", and "models are hugely transformative" (even though some licenses say "no derivatives whatsoever"), "they are doing something amazing so they need no permission", and I'm just being naive.


I'm a radicalized intellectual property abolitionist. The ethical issue with scraping is the DDoS-like effect it has on smaller sites and the bandwidth bills it runs up for medium-sized hosts. There's no individual company at fault for the flood. Rather, it's an emergent result of each startup attempting to train on data that's ever so slightly more up-to-date or broad than its competitors'. If they shared a common corpus that updated once per month, scraping traffic would be buried in organic human visitors instead of the other way around. Let them compete on training methodology, not in a scraping race.


Worrying about that stuff is just a waste of time. Not because of what you said, but because it's all ultimately pointless.

Unless you believe this will kill AI, all it does is create a bunch of data brokers.

Once fees are paid, data is exchanged, and models are trained, if AI takes your programming/drawing/music job, it still takes it. We arrive at the same destination, only with more lawyers in the mix. You get to enjoy unemployment knowing only that lawyers made sure they at least didn't touch your cat photos.


The thing is, if you can make sure that some of your images/music/code aren't used for AI training, then you can be sure that you can continue doing what you do, because your personal style enables the specialty you create.

Maybe you will lose some of your "territory" in the process, but what makes you, you will be preserved. Nobody will be able to ask "draw me a comic with this dialogue in the style of $ARTIST$".


> The thing is, if you can make sure that some of your images/music/code aren't used for AI training, then you can be sure that you can continue doing what you do, because your personal style enables the specialty you create.

Personal styles are a dime a dozen and of far lesser importance than you think.

Professionals will draw in any style; that's how we make things like games and animated movies. Even assuming you had some unique and incredibly valuable style, all it'd take to copy it completely legally is finding somebody else willing to imitate your style to provide training material, and training on that.


> Personal styles are a dime a dozen and of far lesser importance than you think.

Try imitating Mickey Mouse, Dilbert, Star Wars, Hello Kitty, XKCD, you name it.

Randall will possibly laugh at you, but a legal company that happens to draw cartoons won't be amused and will come after you in any way it can.

> Professionals will draw in any style...

Yep, after calling and getting permission and possibly paying some fees to you if you want. There's respect and dignity in this process.

Yet we reduce everything to money, treating machines like humans and humans like coin-operated vending machines.

There's something wrong here.


> Try imitating Mickey Mouse, Dilbert, Star Wars, Hello Kitty, XKCD, you name it.

Those are not styles, they're characters for the most part.

You absolutely can draw heavy inspiration from existing properties, mostly so long as you avoid touching the actual characters. D&D has a lot of Tolkien in it, and I believe the estate is quite litigious. You can't put Elrond in a D&D game, but you absolutely can have "Elf" as a species that looks nigh identical to Tolkien's descriptions.

For style imitation, it's long been a thing to make more anime-ish animation in the West, and anime itself came from Disney.

> Yep, after calling and getting permission and possibly paying some fees to you if you want.

Not for art styles, they won't. Style is not copyrightable.


> Those are not styles, they're characters for the most part. (Emphasis mine)

While I know that styles are not copyrightable for good-faith reasons, massive abuse of good faith is a strong signal for regulation in that area.

> You absolutely can draw heavy inspiration from existing properties, mostly so long you avoid touching the actual characters.

From what I understand, it's mostly allowed for homage and the (un)intentional narrowing of the creative landscape, not for ripping people off.

> For style imitation, it's long been a thing to make more anime-ish animation in the west, and anime itself came from Disney.

But all of that was done in a tradition of cross-pollination; there were no ill intentions, until now.

After OpenAI ripped off Studio Ghibli, things got blurred. This is not just my interpretation, either [0] [1].

Then there's Universal's and Disney's lawsuits against Midjourney. While these are framed as character-copying, when you read between the lines, style appropriation is also something being strongly balked at [2].

So things are not as clear cut as before, because a company stepped on the toes of another one. Small fish might get some benefits as a side-effect.

Addendum: Even OpenAI power-walked away from mocking Studio Ghibli to "maybe we shouldn't do that" [3].

[0]: https://www.theatlantic.com/technology/archive/2025/05/opena...

[1]: https://futurism.com/lawyer-studio-ghibli-legal-action-opena...

[2]: https://variety.com/vip/how-the-midjourney-lawsuit-impacts-g...

[3]: https://www.eweek.com/news/openai-studio-ghibli-ai-art-copyr...


> While I know that styles are not copyrightable for good-faith reasons, massive abuse of good faith is a strong signal for regulation in that area.

It has nothing to do with "good faith"; it's that style isn't really definable. There are thousands of artists who produce very similar outputs.

It'd also be very stupid, because suddenly it'd turn out that if two people draw nearly identically, one could sue the other even if that happened by chance.

> After OpenAI ripped off Studio Ghibli, things got blurred.

Nothing blurry about it. OpenAI is within full legal right to do it. It's kinda in bad taste, that's about it. Anyone can do it. Disney could make a Ghibli style movie if they ever wanted to.

I'm not sure why all the drama, because who even cares? The reason why I watched Ghibli movies wasn't ever about the particular looks.

> Then there's Universal's and Disney's lawsuits against Midjourney. While these are framed as character-copying, when you read between the lines, style appropriation is also something being strongly balked at

You'd better hope it stays at characters, or we're going to have a mess of lawsuits, with people and organizations suing each other because they draw eyebrows a particular way. I fail to see why that is at all desirable.

And of course the big corporations will come out on top of that.


> It has nothing to do with "good faith"; it's that style isn't really definable.

We have something called AI which knows everything; maybe they should ask it. It's very fashionable. Even if the definition is wrong, it's an AI, it can do no wrong. That's what I've heard.

> I'm not sure why all the drama, because who even cares? The reason why I watched Ghibli movies wasn't ever about the particular looks.

Because a man and a studio that draw their movies by hand [0], frame by frame, and spend literal years doing it for a single movie, deserve some respect even if you don't care about the art style.

Even a top-notch studio like Pixar can pump out only a couple of minutes per week [1].

Doing this type of work takes immense dedication, energy, and time. If you think it's worth nothing, I can't say anything about it. I deeply respect these people for what they do, and I'm equally thankful.

> You'd better hope it stays at characters, or we're going to have a mess of lawsuits, with people and organizations suing each other because they draw eyebrows a particular way. I fail to see why that is at all desirable.

Maybe they should drink their own poison to understand what kind of delicate balances they're poking and prodding. The desire for more monies in spite of everything should have some consequences.

[0]: https://www.reddit.com/r/nextfuckinglevel/comments/1egdzja/t...

[1]: https://www.reddit.com/r/todayilearned/comments/8p71cb/til_i...


Sometimes AI is "just like a human", other times AI is "just a machine".

It all depends on what is most convenient for avoiding any accountability.


IP is a pragmatic legal fiction, created to reward developers of creative and innovative thought so that we get more of it. It's not a natural law.

As such fair use is whatever the courts say it is.


Then let's abolish all of them. Patents, copyrights, anything. Let's mail Getty, Elsevier, car manufacturers, chemical plants, software development giants and small startups that everything they have has no protection whatsoever...

Let us hear what they think...

I'm for the small fish here: people who put things out for pure enjoyment, wanting nothing but a little respect for the legal documents they attach to the wares they made meticulously, wares that enable much of the infrastructure that lets you read this very comment, for example.

The current model forcibly rips off the small fish to feed the bigger ones, creating inequality. There are two ways to stop this: the bigger fish respect the smaller fish, because everybody is equal before the law (which will not happen), or we abolish all protections and make the bigger fish vulnerable to the small fish (again, which will not happen).

Incidentally, I'm also here for the bigger fish, too, the ones who put their wares under source-available, "look but don't use" licenses. They are hosed equally badly.

I see the first one as a more viable alternative, but alas...

P.S.: Your comment gets two points. One for deflection (the "it's not natural law" argument), and another for the "but it's fair use!" clause. If we argue that only natural laws are laws, we'll have some serious fun.


> Then let's abolish all of them. Patents, copyrights, anything.

This, but without the irony. Let us be like bacteria, freely swapping plasmids.


Thanks! Yeah, there's a lot of "well, it's 'standard practice' now so it can't be wrong" going on in so many different ways here too...


Yes, all this highly public hand-wringing about "alignment", framed in terms of "but if our AI becomes God, will it be nice to us", is annoying. It feels like it's mostly a combination of things. Firstly, by play-acting that your model could become God, you instill FOMO in investors who see themselves missing out on the hyper-lucrative "we literally own God and ascend to become its archangels" boat. Secondly, you look like you're taking ethics seriously, which deflects regulatory and media interest. And thirdly, it's a bit of fun sci-fi self-pleasure for the true believers.

What the deflection is away from is that the actual business plan here is the same one tech has been running for a decade: welding every flow and store of data in the world to their pipelines, mining every scrap of information that passes through, giving themselves the ability to shape the global information landscape, and then selling that ability to the highest bidders.

The difference with "AI" is that they finally have a way to convince people to hand over all the data.


It's interesting how I think our experience differs completely, for example, regarding people's concerns for AI ethics you write:

>People are far more concerned with the real-world implications of ethics: governance structures, accountability, how their data is used, jobs being lost, etc. In other words, they’re not so worried about whether their models will swear or philosophically handle the trolley problem so much as, you know, reality. What happens with the humans running the models? Their influx of power and resources? How will they hurt or harm society?

This is just not my experience at all. People do worry about how models act, because they infer that the models will eventually be used as a source of truth, and because they already get used as a source of action. People worry about racial makeup in certain historical contexts [1]; people worry when Grok starts spouting Nazi stuff (hopefully I don't need a citation for that one) because they take it as a sign of bias in a system with real-world impact: that if ChatGPT happens to doubt the Holocaust tomorrow, when little Jimmy asks it for help with an essay he will find a whole lot of white-supremacist propaganda. I don't think any of this is fictional.

I find the same issue with the privacy section. Yes, concerns about privacy are primarily about sharing data, precisely because controlling how data is shared is a first, necessary step towards controlling what is done with it. In a world in which my data is taken and shared freely, I have no control over what is done with it, because I have no control over who has it in the first place.

[1] https://www.theguardian.com/technology/2024/mar/08/we-defini...


Thanks for the perspective. For me I think it's a matter of degree (I guess I was a bit "one or the other" when I wrote it).

These things are also concerns and definitely shouldn't be dismissed entirely (especially things like AI telling you when it's unsure, or the worse cases of propaganda), but I'm worried about the other stuff I mention being defined away entirely, the same way I think it has been with privacy. There's tons more to say on the difference between "how you use" vs. "how you share", but good perspective, and it's interesting that you see the emphasis differently in your experience.



