I'm not sure your premise even makes sense here, because it doesn't take an artist many more resources to produce art than it took them to simply exist for the same amount of time. They're still eating, sleeping, making basic use of the computer, using heating and light, and so on either way. Someone using DALL-E, by contrast, is doing all of that plus relying on the immense training costs of the model. Their basic computer use to run the machine learning model might be shorter than the basic computer use to run Procreate or something, but they'll still be using the computer for about the same amount of time anyway, because the time not spent making art will just be shifted over to other things. So it doesn't seem to me like having machine learning models do something for you, instead of learning a skill and doing it yourself, will noticeably decrease emissions or energy usage at all.
Furthermore, even if there is some decrease in emissions from using pre-trained machine learning models instead of your own skills and labor, the energy costs of training a powerful machine learning model like the one you're thinking of are far higher than I think you're imagining. The energy and carbon cost of training even a 213M-parameter transformer for 3.5 days is 626 times the cost of an average human existing for an entire year, according to [this study](https://arxiv.org/abs/1906.02243). Does using a pre-trained machine learning model take that much in emissions out of people's lives? Or a day's worth out of 228,490 lives, perhaps? I doubt it.
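To spell out the day-for-year conversion I'm doing there (taking the study's 626× figure at face value, not re-deriving it):

```python
# One training run = 626 human-years of emissions, per the study's figure
# as quoted above. Spreading that cost across people who each save a single
# day's worth of emissions requires this many people:
TRAINING_VS_HUMAN_YEAR = 626

people_saving_one_day = TRAINING_VS_HUMAN_YEAR * 365
print(people_saving_one_day)  # 228490, the figure quoted above
```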
But we aren't even using such small transformers anymore either; they simply aren't that useful. We're using massive models like GPT-4, and pushing as hard as we can to scale models even further, in a cargo-cult faith that making them bigger will at some point fundamentally, qualitatively shift their capabilities.
So what does the emissions picture look like for GPT-4? The study above found that emissions costs scale linearly with parameter count, tuning steps, and training time, so we can make a back-of-the-napkin estimate that GPT-4 is 8,592,480 times more expensive to train than the transformer used in the study: it is rumored to have 1.76 trillion parameters versus the study model's 213 million, and GPT-3 was said to take 3,640 days of compute to train (despite using enormous amounts of parallel hardware to scale compute in step with model size) versus 3.5 days. This in turn means it is 5,378,892,480 times more expensive to train GPT-4 than it is for a human to live for one year.

And again, to reiterate: no matter what work humans are doing, they're going to be living for around the same amount of time and producing roughly the same carbon emissions, as long as they're not taking cross-country or transatlantic flights or something. So it's more expensive to train GPT-4 than it is for almost 6 billion people to live for a year. I don't think it's taking a year's worth of emissions out of 6 billion people's lives by being slightly more convenient than typing some things in or drawing some art yourself. And there are only 8 billion people on the planet, so I don't think there are enough people to spread smaller gains across to justify training this model (you'd have to take a day's worth of emissions off of 1,963,295,755,200 people to offset that training cost!), especially since, in my opinion, any decrease in emissions from using machine learning models would necessarily be absolutely minuscule.
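The arithmetic behind that estimate can be laid out explicitly. A minimal sketch, using the rumored figures quoted above as givens and taking the linear-scaling claim at face value:

```python
# Back-of-the-napkin scaling estimate. Inputs are the comment's own
# assumptions (a rumored parameter count, GPT-3's reported training time
# used as a proxy for GPT-4's), not measured values.

study_params = 213e6    # 213M-parameter transformer from the study
study_days = 3.5        # its training time
gpt4_params = 1.76e12   # rumored GPT-4 parameter count
gpt4_days = 3640        # training time attributed to GPT-3, used as a proxy

param_ratio = int(gpt4_params / study_params)  # ~8,262x more parameters
time_ratio = int(gpt4_days / study_days)       # 1,040x longer training
cost_ratio = param_ratio * time_ratio          # 8,592,480x the study's cost

# Convert to human-years and human-days via the study's 626x figure:
human_years = cost_ratio * 626                 # 5,378,892,480 human-years
human_days = human_years * 365                 # 1,963,295,755,200 human-days

print(cost_ratio, human_years, human_days)
```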
This back-of-the-napkin estimate for GPT-4 emissions is too high by orders of magnitude. Your estimate is that training it emitted about as much CO2 as 5.38 billion average humans living their lives for a year. With a world population of 8 billion, that would mean GPT-4's training was equivalent to 0.67 years of total anthropogenic CO2 emissions. Since GPT-4's CO2 emissions all come from manufacturing hardware with fossil fuels or burning fossil fuels for electricity, this is roughly equivalent to 0.67 years of global fossil fuel production.
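That 0.67 fraction follows directly from the parent comment's own numbers:

```python
# Dividing the parent comment's estimate by a rough world population figure.
human_year_equivalents = 5_378_892_480  # training cost in human-years, per the estimate above
world_population = 8e9                  # rough world population

fraction_of_global_annual = human_year_equivalents / world_population
print(round(fraction_of_global_annual, 2))  # 0.67
```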
But OpenAI had neither the money nor the physical footprint to consume 0.67 years' worth of global fossil fuel production! At those gargantuan numbers OpenAI would have consumed more energy than the rest of the world combined while training GPT-4. It would have spent trillions of dollars on training. It would have had to build more data centers than previously existed in the entire world just to soak up that much electricity with GPUs.
I still think my point stands about imagining that using ML models decreases emissions versus a human doing the same task, though: humans don't produce much more or fewer emissions depending on what task they're doing, they'll be existing either way, and they'll probably be using the computer the same amount either way, just not spending as much time on that one task. So I don't see how you can argue that using an ML model to write or draw something uses less CO2 than a human doing it. You can't count the CO2 a human emits while existing for the duration of a task as the CO2 cost of that human doing the task, because humans, unlike programs, don't stop existing and consuming resources when they're not doing a task. And you can't really compare the power used to run the ML model to the power used by the human's computer during the time the task takes either, since the human still needs to use a computer to access your ML model, interact with it to refine the prompt, edit the results, and so on (and also because, again, they'll probably just shift any time saved on that task to another task on the computer). Additionally, of course, there's the fact that you can't really use large language models to replace writers, or machine learning image generators to replace artists, if you actually care about the quality of the work.
Thank you! It would have been silly for me to deny that my math was off; I don't really know how I would have rhetorically managed that, lol. I did find another relevant link on this topic after writing my comment above, though: https://futurism.com/the-byte/ai-electricity-use-spiking-pow.... According to that article, although large language models are not yet drawing as much power as I calculated (so my linear extrapolation was still silly), they apparently might eventually do so (0.5% of the world's energy by 2027). The actual study is paywalled, though, so I don't know what their methodology is; they may well be doing the same linear extrapolation I was doing above, so I'm not sure how seriously we should take this. It's something to consider, though, when we weigh the costs and benefits.
I would argue that most social progress comes from automating a task and freeing humans up to do something else - your logic counts just as solidly against building a car in a factory, or using a sewing machine, or a thousand other socially acceptable things. Surely the "LLM Revolution" isn't worse than the Industrial Revolution was?
Nothing I said was about automation per se being bad? I'm not sure where you got that from. I was specifically talking about the relative carbon emissions of machine learning models doing something versus human beings doing something, and arguing that the former doesn't have an emissions advantage over the latter, in my opinion. I don't think that applies to automation in general, because I wasn't making a point about automation; I was just making a point about the relative emissions of two ways of automating something. I actually agree with you that automation is not, in principle, a bad thing, and that economies can adjust to it in the long run and even be much better off for it. We would probably disagree on some things, though, since I think our current economic system tends to use automation to increase inequality and to centralize power and resources in corporations and the rich, rather than truly benefiting everyone: those with economic power will own the automation and use it to their advantage, making the average person useless to them rather than directly benefiting us. But that's an entirely different discussion, really.