dimitrios1's comments | Hacker News

Discussing the subject without reactionary political takes is more valuable.

> sound scary when presented without context

It's not about it being scary, it's about it being a gigantic, stupid waste of water, and for what? So that lazy executives and managers can generate the shitty emails they used to have their comms person write for them, so that students can cheat on their homework, or so degens can generate a video of MLK dancing to rap? Because that's the majority of the common usage at this point, and it's what is creating the demand for all these datacenters. If it was just for us devs and researchers, you wouldn't need this many.


Whether it's a "gigantic" waste of water depends on what those figures actually mean: is 25 million liters of water per year a gigantic number or not?


For comparison, it's about 10 Olympic-sized swimming pools' worth of water, which doesn't seem very significant to me. Unless you're going to tell people they're not allowed swimming pools any more because swimming doesn't produce enough utility?

And at any rate, water doesn't get used up! It evaporates and returns to the sky to rain down again somewhere else; it's the most renewable resource in the entire world.


If only millions of people suffering from lack of water knew this.


Would we be sending that water to those millions of people instead?


If you redistributed this water to a million people suffering from lack of water, they'd get about 2 shot glasses' worth per day.
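For anyone who wants to sanity-check both figures, here's a quick back-of-envelope sketch, assuming roughly 2.5 million liters per Olympic pool and a ~40 ml shot glass (those two constants are my assumptions; the 25 million liters/year is the figure from upthread):

    # Rough check of the numbers quoted above (all values approximate).
    ANNUAL_USE_LITERS = 25_000_000    # 25 million liters per year, figure from the thread
    OLYMPIC_POOL_LITERS = 2_500_000   # an Olympic pool holds ~2.5 million liters (assumption)
    SHOT_GLASS_LITERS = 0.04          # ~40 ml shot glass (assumption)
    PEOPLE = 1_000_000                # "a million people" from the comment above

    pools_per_year = ANNUAL_USE_LITERS / OLYMPIC_POOL_LITERS
    ml_per_person_per_day = ANNUAL_USE_LITERS / PEOPLE / 365 * 1000
    shots_per_person_per_day = ml_per_person_per_day / (SHOT_GLASS_LITERS * 1000)

    print(f"{pools_per_year:.0f} Olympic pools per year")              # ~10
    print(f"{ml_per_person_per_day:.0f} ml per person per day")        # ~68
    print(f"{shots_per_person_per_day:.1f} shot glasses per person")   # ~1.7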


Yeah, but it's not like those people have never seen water. And yet it's not so simple that you can use water and it will eventually come back to you; there is a hell of a lot more nuance to this.


It's not gigantic and it's not a waste. Brainrot creates massive economic value that can be used to pay people for products you are more happy to consume.


And also, none of those current use cases are a real benefit to society, outside of maybe research cases.

The only benefit is to the already wealthy owner class that is itching to not have to pay for employees anymore because it impacts their bottom line (payroll is typically the largest expense).

It's not like we are making robots to automate agriculture and manufacturing to move toward a post-scarcity, moneyless society, which would have real benefits. No, instead we have AI companies hyping up a product whose purpose (according to them) is so that already-wealthy people can hoard more wealth and not have to pay for employees. It's promising to take away a large portion of the only high-paying jobs we have left for the average person without an advanced degree.

Me being able to write software a little faster, without hiring a junior, is a net negative to society rather than a benefit.


You appear to be arguing against using technology to boost human efficiency on a forum full of software engineers who've dedicated their careers to building software that makes humans more efficient.

If we aren't doing that then why are we building software?


Because the stated goal of generative AI is not to make an individual more efficient; it's to replace that individual altogether and completely eliminate the bottom rungs of the professional career ladder.

Historically, software that made humans more efficient resulted in empowerment for the individual, and it also created a need for new skilled roles. Efficiency gains were reinvested into the labor market, and more people could enter higher-paying work.

With generative AI, if these companies achieve their stated goals, what happens to the wealth generated by the efficiency?

If we automate agriculture and manufacturing, the gain is distributed as post-scarcity wealth to everyone.

If we automate the last few remaining white-collar jobs that pay a living wage, the gain is captured entirely by the capital owners and investors via the elimination of payroll, while society merely loses one of its last high-paying ladders for upward mobility.

Nobody lost their career because we built a faster operating system or a better compiler. With generative AI's stated goals, any efficiency gains are exclusively for those at the very top, while everyone else gets screwed.

Now, I'll concede that's not the AI companies' fault. I'm not saying we should magically stop developing this technology, but we absolutely need our governments to start thinking about the ramifications it can have and to seriously consider things like UBI, so we're prepared for when the bottom falls out of the labor market.


Thanks, that's a well argued comment.

I'm not a fan of the "replace workers with AI" thing myself - I'm much more excited about AI as augmentation for existing workers so they can take on more challenging tasks.


Does the future productivity growth that we lose later (because fewer junior engineers are entering the field) outweigh the AI gains?

If it's just the little productivity boost now, I think it's a net negative if hiring trends continue.

I think it's a discussion to be had, but the talent pool is a tragedy-of-the-commons situation.


Seems the problem is the revealed preference of the normies, rather than the technology itself.


One thing I can say definitively, as someone who is definitely not an AI zealot (more of an AI pragmatist): GPT language models have lowered the barrier to running your own bare-metal server. AWS salesfolk have long used the boogeyman of the costs (opportunity, actual, maintenance) of running your own server as the reason you should pick AWS (not realizing you are trading one set of boogeymen for another), but AI has reduced a lot of that burden.


That's more a case of survivorship bias. Microsoft continued to maintain its lockdown on government IT and infrastructure through the decades, over the alternatives.


Life in the fallen world is indeed dark, and certainly was darker a mere few generations ago. The difference is we have lost the frameworks generations past used for dealing with major depressive episodes, and have opted for more "enlightened" approaches that are clearly working /s


There is a whole 'nother level of safety validation that goes beyond your everyday OWASP, or heck, even beyond what we consider "highly regulated" industry requirements that 95-99% of us devs care about. SQLite is used in some highly specialized, highly sensitive environments where they are concerned about bit flips and corrupted memory. I had the luxury of sitting through Richard Hipp's talk about it one time, but I am certainly butchering it.


I have noticed that it coincides with the re-election of a certain political candidate (He who must not be named).

The facade of "critical and rational thinker" has all but completely fallen away and this place has revealed itself for the true ideological echo chamber that it is.


By accepting the fact that sometimes (many times) you won't get the outcome you desire, in the manner in which you desire it.


This is the uniparty at work.


So then removing the subsidies shouldn't be an issue?


It's not just subsidies, it's also permitting and any other roadblocks they can manage. This isn't just economic or political, it's a weird personal crusade.


If only it were just the subsidies, but it's not.

https://www.reuters.com/legal/litigation/us-orders-orsted-ha...


It shouldn’t. Trump is using environmental regulation to block projects. It’s crazy seeing the GOP embrace San Francisco’s last decade of policy.


"'I never thought leopards would eat MY face,' sobs woman who voted for the Leopards Eating People's Faces Party."

