Hacker News | andrewstuart's comments

Why are people still using AWS?

And then writing “I regret it” posts that end up on HN.

Why are people not getting the message to not use AWS?

There are SO MANY other options that are faster, cheaper, less complex, and more reliable, but people continue to use AWS. It makes no sense.


Examples?


Of what?


> faster cheaper less complex more reliable options


Allow me to google that for you…..

https://www.ionos.com/servers/cloud-vps

$22/month for 18 months with a 3-year term: 12 vCores, 24 GB RAM, 720 GB NVMe

Unlimited traffic at 1 Gbps


AWS is not just EC2

And even EC2 is not just a VPS

If you need a simple VPS, yes, by all means, don't use AWS.

For this use case AWS is definitely neither cheaper nor simpler. Nobody said that. Ever.


They’re Linux computers.

Anything AWS does you can run on Linux computers.

It’s naive to think, out of brand loyalty, that AWS is some sort of magically special system that transcends other networked computers.

That’s the AWS Kool-Aid that makes otherwise clever people think there’s no way any organization can run its own computer systems - only AWS has the skills for that.


It was already clear that you were arguing in bad faith when you suggested a VPS to replace AWS; no need to insist.

But you are absolutely right, I'm drinking the AWS kool aid like thousands of other otherwise clever people who don't know that AWS is just Linux computers!


Good luck managing all the day-2 operations and the application layer on top of your VPS. You're just shuffling your spending around: it's no longer on compute, but on the manpower to manage that mess.


In theory. Good luck rolling your own version of S3.


You probably don't need it. I see so many people getting price gouged by S3 when it would be orders of magnitude cheaper to just throw the files on a basic HTTP server.

I sometimes feel bad using people's services built with S3 as I know my personal usage is costing them a lot of money despite paying them nothing.
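For context, "just throw the files on a basic HTTP server" can be as little as a few lines. A rough sketch only (the directory path and port here are made up, and you'd still want TLS and caching in front, e.g. nginx or a CDN):

    # Rough sketch: serve a directory of static files over plain HTTP.
    # The path and port below are placeholders, not a recommendation.
    from functools import partial
    from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

    handler = partial(SimpleHTTPRequestHandler, directory="/srv/files")
    ThreadingHTTPServer(("0.0.0.0", 8080), handler).serve_forever()

Whether that beats S3 obviously depends on your durability and egress needs; the point is only that "serve files over HTTP" is not inherently a big build.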


A web server isn’t a storage solution. And a storage solution like S3 isn’t a delivery network. If you use the wrong tool, expect problems.


A web server is connected to storage like SSDs, and S3 is connected to delivery networks like the Internet. Using SSDs to store files, or the Internet to send files to a user, is not using the wrong tools.


Must be exhausting to have to explicitly learn all that.


I don't know whether the author is on the spectrum, but for many people who are, it feels exactly like this.


The word trillion and AWS in the same sentence makes my wallet tremble in fear.


It makes sense to allow people to license their own voice on a per project basis.

Voices of dead actors? Feels wrong.


It never occurred to me that Oracle could go pop.


Yeah, I don't think people are processing just how much money is being spent. AI is a useful tool, but the financial aspects of it are monstrous. For comparison, the OpenAI deals announced this year are bigger than the US defense budget. Nuclear weapons, stealth bombers, nuclear submarines, F-35s, a million soldiers... more money than all of that.


Think what an aircraft carrier costs to build (approx 13 billion USD) and then to run per year (approx 0.75-1 billion USD), and then consider that Sam Altman has promised 100 times that to all his partners. A floating city of 5,000 people that can launch a wing of fighter planes and support craft and turn any enemy city into rubble costs a tiny fraction of what it takes to support a chatbot.
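To put rough numbers on "100 times that", using only the figures above (my arithmetic, not a sourced estimate):

    100 \times \$13\,\text{B (build)} \approx \$1.3\,\text{T total}
    100 \times \$0.75\text{--}1\,\text{B/yr (operate)} \approx \$75\text{--}100\,\text{B/yr ongoing}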


The thing is: crypto was crap. It was never going to revolutionize the world and generate billions of dollars of value. It’s like a decentralized bank, with the good and bad that comes along with that.

AI can actually revolutionize the world. The best-case outcomes of all these bets are bonkers. If one company can land it and everyone else fails, it’s trillions. I’m less optimistic about what happens if 10 companies all have AI models with roughly similar performance. How much can they charge?


No company will land it building attention-based transformers. They simply don't do what people say they do. They are very powerful, and I love using Claude Code every day at work, but they aren't going to become autonomous androids, because they don't even respect what little we know about cognitive science. What I think is more exciting across all of this work is that many of these models are a strong step forward in predicting physical systems, which means we can have more accurate and more complex control systems for manufacturing, transport, etc. You can use transformers to take a supercomputer-scale weather model and run it on a desktop GPU. That is rather impressive, imo.


1) Are some intelligent people, with power over hundreds of millions, billions even, really that sure that AI will pay off?

2) Is it all a gamble by not-that-intelligent people who have power over those billions?

3) Is it intelligent people who know this will never work, just pretending for the grift and ripping off the people in 2)?

It's fascinating and I can't wait to see how it ends, despite knowing there is no happy ending possible anymore. Either most people are replaced by AI and society as we know it ends, or there is a gigantic crash waiting.


Sam Bankman-Fried isn't in jail right now because of a lack of IQ.

There are all these social dynamics at play that cause really smart people to do really stupid things sometimes.


It's 3, surely. Even if the companies crash, the people leading them won't - they will have extracted an absurd amount of cash, and when the crash comes they're like "oh, we paid ourselves millions for our good work, we couldn't possibly have seen the crash coming. It's nobody's fault! Anyway, let's get our bankruptcy proceedings over with already; there isn't much worth left in the company anyway."


I believe there is a lot of "make money now, or in the next 3 to 12 months, whatever it takes." Unless you go to a really egregious level of fraud, you likely won't be punished. So the optimal play is to skim your bonuses from the game. Same goes for fund management: keep the numbers looking good, stay on the mostly legal side, and you get to skim from the top.

Eventually this will all come crashing down.


This whole thing is so repulsive, makes me want to vomit.


I wonder if the way we have to look at the A.I. race is as a form of cold war. During the Cold War, military expenses made no economic sense, but we had to spend anyway to come out on top. At this point, it's "who can borrow the most without bankrupting itself, or survive until a government bailout".



4) the whole system is rigged and the playbook is entering a new chapter


So OpenAI is clearly a military too :)


I wonder. They have powerful friends and lots of dependents still (despite their abusive practices), so god knows how this will go.


Google owns data centers, computers and AI chips - that’s a big advantage over Anthropic and OpenAI.


“How dare ffmpeg be so arrogant! Don’t they know who we are? Fork ffmpeg and kill the project! I grant a budget of 30 million to crush this dissent! Open source projects must know who’s boss! I’ll stomp em like a union!”

…. overheard at a meeting of CEO and CTO at generic evil mega tech corp recently.


What’s the point of this?

Seems maybe to be a cloud apologist post - “AWS is inevitable; sure it sucks and is unreliable, but that’s the present and future, so don’t you dare get to thinking you could build your own redundant systems, cause you can’t.”

Seems to be the message.


Commercial TTS mostly sucks too.

There are flashes of brilliance, but most of it is noticeably computer generated.


The Gemini models, Eleven V3, and whatever internal audio model Sora 2 uses are about neck and neck, converging in performance. They have some unexplainable flavor to them, though. Especially Sora.


>> I fucking hate this.

>> And I cannot help but feel disgust and shame. Is this what programming is now?

I love it. LLM assisted programming lets me do things I would never have been able to do on my own.

There’s never been a greater leap in programming than the LLM.

No doubt the process is messy and uncertain and all about wild goose chases but that’s improving with every new release of the LLMs.

If you understand everything the LLM wrote then you’re holding it wrong.

I don’t hear developers disowning their work because they didn’t write the machine code that the compiler and linker output. LLM assisted programming is no different.

I’m excited about it and can’t wait to see where it all goes.


Compiler and linker output is totally different. It comes from deterministic, hard-coded logic that you can trust and build on. It's in a totally different category.

Compiler and linker output is like trusting well-proven math theorems written in standardized symbols, published in peer-reviewed books. LLM output is like asking random people on the street for random ideas in ambiguous language based on who knows what.

The jump to LLMs is in no way analogous to the jump to a higher level programming language.


"If you understand everything the LLM wrote then you’re holding it wrong."

If you're using an LLM to build production code, then yes, you should very much understand everything the LLM wrote.


Lucky I'm not an SRE!


I've successfully built projects that I had previously abandoned because the amount of tangential learning required felt overwhelming. With AI assistance, I'm now reaching goals that were once just beyond my grasp, which is an incredible feeling.

I think those who dismiss 'vibe coding' haven't tried to use it for something truly beyond their current skill set. There's also an implicit sneer in the criticism—a kind of 'Oh, so you need an AI to help you code?'—that misses the point entirely.

[btw: I used Deepseek to reword my original reply :)]

