Hacker News

The idea that machine learning systems like large language models and image generators exploit labor might be up for debate, but the fact that they are disproportionately damaging to the environment compared to the alternatives is certainly true, in the same way it's true for Bitcoin mining. And there's more than just those two aspects to consider. It's also very much worth asking how the widespread use of such technologies, and their integration into our economy, might change our political, social, and economic landscape, and whether those changes would be good, bad, or worth the downsides. I think it's perfectly valid to decide that an emerging technology is not worth the negative changes and downsides it will bring, and to reject its use. Technological progress is not inevitable in the sense that every new technology must become widespread.


> disproportionately damaging to the environment compared to the alternatives

This is a new one to me. Do you have any source for that? Once a model is trained, it seems pretty obvious that it takes Dall-E vastly less energy to create an image than it takes a trained artist. I have trouble believing the training costs are really so large as to tip the equation back in favor of humans.


Dall-E is usually not an alternative to a trained artist, but an alternative to downloading a stock image from the internet, which takes way less energy.


AI-generated images have already won numerous awards. They can easily make assets good enough for an indie video game. And even stock images have to come from somewhere.


Jevons paradox, though? Planes are much more efficient now than the first prototypes, yet usage is so much higher that total resource consumption from air travel has vastly increased. The same goes for generative models.


I'm not sure your premise even makes sense here, because it doesn't take an artist much more in resources to produce art than it takes them simply to exist for the same amount of time. They're still eating, sleeping, making basic use of the computer, using heating and light, and so on either way. Someone using Dall-E, meanwhile, is doing all of that plus relying on the immense training costs of the model. Their basic computer use to run the model might be shorter than an artist's computer use to run Procreate or the like, but they'll still be on the computer for about the same amount of time anyway, because the time not spent making art will just shift to other things. So it doesn't seem to me that having machine learning models do something for you, instead of learning the skill and doing it yourself, will noticeably decrease emissions or energy usage at all.

Furthermore, even if there is some decrease in emissions from using pre-trained machine learning models rather than your own skill and labor, the energy cost of training a powerful model of the kind you're thinking of is far higher than I think you're imagining. According to [this study](https://arxiv.org/abs/1906.02243), the energy and carbon cost of training even a 213M-parameter transformer for 3.5 days is 626 times the cost of an average human existing for an entire year. Does using a pre-trained model remove that much emission from people's lives? Or a day's worth from 228,490 lives, perhaps? I doubt it.
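(For concreteness, that day count is just the study's 626 human-years converted into human-days; a quick sanity check, where 626 is the figure quoted from the study above:)

```python
# The study's headline figure, as quoted above: one training run of the
# 213M-parameter transformer costs ~626 human-years of CO2 emissions.
human_years = 626
days = human_years * 365  # convert human-years to human-days of emissions

print(days)  # 228490, matching the number above
```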

But we aren't even using such small transformers anymore either; they actually aren't that useful. We're using massive models like GPT-4, and pushing as hard as we can to scale models even further, in a cargo-cult faith that making them bigger will at some point fundamentally and qualitatively shift their capabilities.

So what does the emissions picture look like for GPT-4? The study above found that emissions costs scale linearly with parameter count and training time, so we can make a back-of-the-napkin estimate that GPT-4 is 8,592,480 times more expensive to train than the transformer used in the study: it is rumored to have 1.76 trillion parameters versus the study model's 213 million, and GPT-3 was said to take 3,640 days to train (despite using insane amounts of simultaneous compute) versus 3.5 days. That in turn means training GPT-4 is 5,378,892,480 times more expensive than a human living for one year. And again, to reiterate: no matter what work humans are doing, they're going to live for about the same amount of time and emit roughly the same amount of carbon, as long as they're not taking cross-country or transatlantic flights or something. So training GPT-4 is more expensive than almost 6 billion people living for a year. I don't think it takes a year's worth of emissions out of 6 billion people's lives by being slightly more convenient than typing some things in or drawing some art yourself. And there are only 8 billion people on the planet, so there aren't enough people to spread smaller gains across to justify the training (you'd have to take a day's worth of emissions off of 1,963,295,755,200 people to offset that training cost!), especially since, in my opinion, the per-use decrease in emissions would be absolutely minuscule.
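The extrapolation above can be sketched in a few lines of Python. Note that the GPT-4 parameter count and the 3,640-day training figure are the rumored numbers quoted in this thread, not confirmed values:

```python
# Linear extrapolation from the study's 213M-parameter transformer,
# using the rumored GPT-4/GPT-3 figures quoted above (unconfirmed).
params_study = 213e6      # parameters in the study's transformer
days_study = 3.5          # its training time in days
params_gpt4 = 1.76e12     # rumored GPT-4 parameter count
days_gpt4 = 3640          # training-days figure cited for GPT-3

scale = (params_gpt4 / params_study) * (days_gpt4 / days_study)
human_years = scale * 626  # study's cost: 626 human-years per run

print(f"{scale:,.0f}x the study's model")        # ~8.6 million
print(f"{human_years:,.0f} human-years of CO2")  # ~5.4 billion
```

(The exact figures in the comment differ slightly from this output due to rounding, but the orders of magnitude are the same.)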


This back-of-the-napkin estimate for GPT-4 emissions is too high by orders of magnitude. Your estimate is that training it emitted about as much CO2 as 5.38 billion average humans living their lives for a year. With a world population of 8 billion, that would mean GPT-4's training was equivalent to 0.67 years of total anthropogenic CO2 emissions. Since GPT-4's CO2 emissions all come from manufacturing hardware with fossil fuels or burning fossil fuels for electricity, this is roughly equivalent to 0.67 years of global fossil fuel production.

But OpenAI had neither the money nor the physical footprint to consume 0.67 years' worth of global fossil fuel production! At those gargantuan numbers OpenAI would have consumed more energy than the rest of the world combined while training GPT-4. It would have spent trillions of dollars on training. It would have had to build more data centers than previously existed in the entire world just to soak up that much electricity with GPUs.
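The sanity check is simple arithmetic; a minimal sketch, where 5.38 billion is the parent's estimate and the world population is rounded to 8 billion:

```python
# How big a slice of annual global emissions would the parent's
# estimate imply? (5.38e9 human-years is the parent's figure.)
estimate_human_years = 5.38e9
world_population = 8e9

fraction_of_a_year = estimate_human_years / world_population
print(f"{fraction_of_a_year:.2f} years of global emissions")  # 0.67
```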


That's a good point; that's what I get for doing a linear extrapolation. This looks like a better estimate, and it doesn't look good for my argument: https://towardsdatascience.com/the-carbon-footprint-of-gpt-4...

I still think my point stands about imagining that using ML models decreases emissions versus a human doing the same task, though. Humans don't emit much more or less depending on what task they're doing, and they'll be existing either way, probably using the computer the same amount either way, just not spending as much time on that one task. So I don't see how you can argue that using an ML model to write or draw something uses less CO2 than a human doing it. You can't count the CO2 a human emits while doing a task as the cost of the task, because humans don't stop existing and consuming resources when they're not doing a task, unlike programs. And you can't really compare the power used to run the model to the power used by the human's computer during the task either, since the human still needs a computer to access your model, interact with it to define the prompt, edit the results, etc. (and also because, again, they'll probably just shift any saved time to another task on the computer). Additionally, of course, there's the fact that you can't really replace writers with large language models, or artists with image generation tools, if you actually care about the quality of the work.


Huge kudos for admitting this changes your reasoning - I don't see people willing to admit that often, especially on the internet.


Thank you! It would have been silly to deny that my math was off; I don't really know how I would have rhetorically managed that lol. I did find another relevant link on this topic after writing my comment above, though: https://futurism.com/the-byte/ai-electricity-use-spiking-pow.... According to that article, although large language models are not yet drawing as much power as I calculated (so my linear extrapolation was still silly), they might eventually get there (0.5% of the world's energy by 2027). The actual study is paywalled, though, so I don't know their methodology; they may well be doing the same linear extrapolation I was doing above, so I'm not sure how seriously we should take it. But it's something to consider when we weigh the costs and benefits.


I would argue that most social progress comes from automating a task and freeing humans up to do something else - your logic counts just as solidly against building a car in a factory, or using a sewing machine, or a thousand other socially acceptable things. Surely the "LLM Revolution" isn't worse than the Industrial Revolution was?


Nothing I said was about automation per se being bad? I'm not sure where you got that from. I was specifically comparing the carbon emissions of machine learning models doing something versus human beings doing it, and arguing that, in my opinion, the former has no emissions advantage over the latter. That doesn't really apply to automation in general; I wasn't making a point about automation, just about the relative emissions of two ways of doing something. I actually agree with you that automation is not in principle a bad thing, and that economies can adjust to it in the long run and even be much better off for it. We would probably disagree on some things, though: I think our current economic system tends to use automation to increase inequality and to centralize power and resources in corporations and the rich rather than truly benefiting everyone, because those with economic power own the automation and use it to their advantage, while making the average person useless to them and not directly benefiting us. But that's an entirely different discussion, really.


It’s so ironic that you have this stance about the value of other people but you feel so humiliated by OP as to think they’re bullying.


I think you replied to the wrong post?


How's it more damaging to the environment if you can replace 1k people? That's 1k people staying at home instead of commuting. Sure, that causes pain if we can't figure out UBI or a way to house and feed the masses. Also, many of the biggest AI users are working to get their energy 100 percent from solar, wind, and geothermal. AI is something we've been heading towards since the dawn of man.

Hell, ancient Rome had automatons. There's no way to stop it. Ideally we merge with the AI to become something else, rather than give it superpowers and have it decide to destroy us. I'm not sure the benevolent caregiver of humanity is something we can hope for.

It's a scary but interesting future. But we've also got major problems like cancer and global warming, and AI is a killer researcher: it did 300k years' worth of human research hours in a month to find tons of materials that can possibly be used by industry.

They're doing similar work with medicine, etc. There are many pros and negatives. I'm a bit of an accelerationist, rip-the-band-aid-off kind of guy. Everyone dies someday, I guess, and not everyone can say they were killed by a Terminator, well, at least not yet lol, tongue in cheek.


> I'm a bit of an accelerationist, rip the band-aid off kind of guy, everyone dies someday I guess

Are you volunteering to go first?


> how's it more damaging to the environment if you can replace 1k people, that's 1k people staying at home instead of commuting,

Check my comment above, where I do some rough back-of-the-napkin calculations around this. Training GPT-4, for example, produced around 6 billion times the carbon emissions a human emits in total in a year, which presumably includes commuting. So unless GPT-4 removes the commutes of significantly more than 6 billion people (since it wouldn't eliminate their emissions entirely, just the commuting portion), it's a net loss. Also, we can eliminate commute emissions by building better public transportation and walkable/bikeable cities; we don't need to prostrate ourselves before a dementia-addled machine God to get there.
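To make that break-even point concrete: if commuting is only a fraction of a person's annual footprint, the headcount needed scales up by the inverse of that fraction. A sketch, where the 6-billion figure is this thread's rough estimate and the commute share is a purely illustrative assumption, not a measured value:

```python
# Break-even headcount: how many people's commutes would a training run
# need to eliminate? Both inputs are rough/illustrative, per the thread.
training_cost_human_years = 6e9  # thread's rough estimate for GPT-4
commute_share = 0.15             # ASSUMED share of a person's annual footprint

people_needed = training_cost_human_years / commute_share
print(f"{people_needed:,.0f} people")  # 40 billion, far above world population
```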



