
That experience is heavily subsidized and, based on what we know, unprofitable for the companies providing it. Even with all of the other developers using the same workflow and espousing how great it is, and even with all of the monthly subscribers at various tiers, it has been unprofitable for several years, continues to be unprofitable, and will likely remain so given current trends.

The author spends a good amount of bytes telling us that they don't want to hear this argument even though they expect it.




I think these types of arguments need to at the very least acknowledge the distribution of cost between training and inference.


Perhaps, and the externalities are often unaccounted for or hand-waved away.

Even the US government is getting involved in subsidizing these companies and all of the infrastructure and resources needed to keep it all expanding. We can look forward to even more methane power plants, more drilling, more fracking, more noisy data centres sucking up fresh water from local reserves, and increased damage to the environment, all of which will come out of the pocketbooks of... ?

Update: And for what? "Deep Research"? Apparently it's not that great or world-changing for the costs involved. It seems that the author is tired of the yearly promise that everything is just a year or two away as long as we keep shovelling more money and resources into the furnace.


Inference isn’t that expensive. A single junior dev costs orders of magnitude more than the amount of inference I use. Companies in growth mode don’t have to make money; it’s a land grab right now, and the expense is largely in the R&D. You can build a rig to run full models for $10-20k, right? That’s only a month or two of a junior dev’s time, and after that it’s just electricity. And you could have dozens of devs sharing the same rig as long as they could timeshare it. I don’t see where the economics wouldn’t work; there’s just no point in investing in the hardware until we know where AI is going.
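Since that claim is basically arithmetic, here's a rough back-of-the-envelope sketch of it in Python. Every number in it is an assumption (rig price, hardware lifetime, electricity, dev cost, how many devs share the box), not a figure from the thread; it only illustrates how the "dozens of devs timesharing one rig" amortization plays out.

```python
# Back-of-the-envelope: a shared local inference rig vs. one junior dev.
# All numbers below are assumptions for illustration, not measured data.

rig_cost = 15_000              # assumed one-time hardware cost (USD), mid of the $10-20k guess
rig_lifetime_months = 36       # assumed useful life before the hardware is obsolete
power_per_month = 300          # assumed electricity cost per month (USD)
devs_sharing = 12              # assumed number of devs timesharing the rig

junior_dev_per_month = 10_000  # assumed fully-loaded monthly cost of a junior dev (USD)

rig_monthly = rig_cost / rig_lifetime_months + power_per_month
per_dev_monthly = rig_monthly / devs_sharing

print(f"Rig, amortized:            ${rig_monthly:,.0f}/month")
print(f"Per dev sharing the rig:   ${per_dev_monthly:,.0f}/month")
print(f"Junior dev:                ${junior_dev_per_month:,.0f}/month")
print(f"Dev cost vs. rig share:    {junior_dev_per_month / per_dev_monthly:,.0f}x")
```

With these assumed inputs the per-dev share of the rig comes out around $60/month against roughly $10k/month for the dev, i.e. a couple of orders of magnitude, which is the point the comment is making; change the assumptions and the ratio moves, but not by enough to flip the conclusion.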


Yeah, you can build a rig to run full models for $10-20k... That's a big reason OpenAI might not make it. The whole article is about LLMs not being a viable business.


It is unprofitable because they keep spending money developing new AI. Inference for existing AI is not unprofitable.


For now.

Unless closed models have a significant advantage, AI inference will be a commodity business, like server hosting.

I'm not sure that closed models will maintain an advantage.



