
Revenue isn't profit. They're burning money at an impressive rate: https://www.nytimes.com/2024/09/27/technology/openai-chatgpt...

I wouldn't even be surprised if they were losing money on paying ChatGPT users from inference compute alone, and that's before factoring in the cost of developing new models.

There was an interesting article here (can't find the link, unfortunately) arguing that model training costs should be accounted for as operating costs, not investments: last year's model is essentially a total write-off, and to stay competitive an AI company needs to keep training newer frontier models more or less continuously.



Training costs are fixed regardless of user count, which makes them a natural moat even if the models need continual retraining. Success would mean 10-100x current users, at which point training costs at the current scale just wouldn't matter.

Really their biggest risk is total compute costs falling too quickly or poor management.


The potential user base seems quite finite to me, even under most optimistic assumptions.


Not everyone is going to pay $20/month, but an optimistic trajectory is that they largely replace search engines while acting as a back end for a huge number of companies.

I don’t think it’s very likely, but think of it like an auction. In a room with 100 people 99 of them should think the winner overpaid. In general most people should feel a given startup was overvalued and only looking back will some of these deals look like a good investment.


WSJ reported today that ChatGPT has 250M weekly users. 10x that would be nearly a majority of internet users. 100x that would be significantly more than the population of Earth.


Someone can be a direct user, be covered under some company's corporate account, and be an indirect user via third parties using OpenAI on their backend.

As long as we're talking about independent revenue streams, they're worth counting separately from an investment standpoint.


> I wouldn't even be surprised if they were losing money on paying ChatGPT users on inference compute alone

I'd be surprised if that were the case. How many tokens is the average user going through? I'd be surprised if the average user even hit 1M tokens, much less 20M.
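As a rough sanity check (a sketch, not real usage data), here's what those token counts would cost at the GPT-4o API rates quoted elsewhere in this thread ($3.75/M input, $10/M output), pessimistically billing every token at the higher output rate:

```python
# Back-of-envelope: monthly inference cost per user at API list prices.
# Assumes GPT-4o's quoted output rate of $10 per million tokens and,
# as a worst case, treats every token as an output token.

def monthly_cost(tokens_per_month: int) -> float:
    """Dollar cost of serving this many tokens at $10 per million."""
    return tokens_per_month * 10.00 / 1_000_000

print(monthly_cost(1_000_000))    # 1M tokens  -> $10.00/month
print(monthly_cost(20_000_000))   # 20M tokens -> $200.00/month
```

So even at list prices (which presumably include a margin over raw compute), a 1M-token-per-month user would cost well under a $20 subscription; only heavy users would push past it.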


With o1? A lot.

Even for regular old 4o: You’re comparing to their API rates here, which might or might not cover their compute cost.


o1 is about $2-$4 per message over the API. I'm probably costing OpenAI my entire subscription fee within 24 hours of my renewal each month.

Voice mode is around $0.25 per minute via API. I don't use it much, but 3 minutes per day would already exceed the cost of a ChatGPT Plus subscription by quite a bit.
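The voice-mode arithmetic, spelled out (a sketch assuming the quoted $0.25/minute API rate and a 30-day month):

```python
# Voice mode: does light daily use exceed a $20/month subscription?
RATE_PER_MIN = 0.25   # quoted API rate, dollars per minute
MINUTES_PER_DAY = 3
DAYS_PER_MONTH = 30

monthly = RATE_PER_MIN * MINUTES_PER_DAY * DAYS_PER_MONTH
print(monthly)  # 22.5 -> already above a $20 Plus subscription
```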


> o1 is about $2-$4 per message over the API

I’m not sure I understand this, sorry. I see GPT-4o at $3.75 per million input tokens and $10 per million output tokens, on OpenAI’s pricing page.

That's expensive, and I can't see how they can run Copilot on the standard API pricing. But by my math that makes a message (one interaction?) cost well under $4.

How many tokens are in a typical message for you?
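One way to frame the skepticism: at GPT-4o's quoted $10-per-million output rate, a single $4 message would have to emit an enormous number of tokens (a sketch; o1's rates are higher and include hidden reasoning tokens, so its per-message cost can be much larger):

```python
# How many output tokens would a $4 message imply at GPT-4o's
# quoted $10-per-million output rate? (Ignoring the cheaper input side.)
COST_PER_MESSAGE = 4.00       # dollars
RATE_PER_MILLION = 10.00      # dollars per million output tokens

tokens_implied = COST_PER_MESSAGE * 1_000_000 / RATE_PER_MILLION
print(tokens_implied)  # 400000.0 output tokens per message
```

That's roughly 400k output tokens for one reply, far beyond a typical chat message, which is why the $2-$4 figure only makes sense for a model with much higher rates or hidden reasoning tokens.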


I think this might be the article you mentioned

https://benn.substack.com/p/do-ai-companies-work


That was it, thank you!


Thought of this way, AI companies are remarkably similar to Bitcoin mining companies, which always just barely stay ahead of the difficulty increases and often fail.



