The cost per user for Google and FB was/is almost insignificant (relative to LLMs), so all the ad revenue was almost free cash.
It’s not even clear whether OpenAI breaks even on the $20 subscription from GPU/compute costs alone (the newer models seem to be a lot faster, so maybe they do). So incrementally growing their revenue might be very painful if they keep making the UX worse with extra ads while simultaneously losing money on every user.
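Just to make the shape of that question concrete, here's a rough back-of-envelope sketch; the usage and per-token cost figures are made-up placeholders, not anything OpenAI has disclosed:

    # Purely illustrative numbers, not from any OpenAI disclosure
    subscription = 20.0           # USD per user per month
    tokens_per_month = 2_000_000  # assume a heavy user consumes ~2M tokens/month
    cost_per_1m_tokens = 5.0      # assumed blended GPU/inference cost per 1M tokens, USD

    compute_cost = tokens_per_month / 1_000_000 * cost_per_1m_tokens
    print(f"compute cost ~ ${compute_cost:.2f} vs ${subscription:.2f} subscription")
    # => compute cost ~ $10.00 vs $20.00 subscription

Whether the heavy users break even hinges almost entirely on the assumed cost per token, which is exactly what faster models and better HW would push down.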
Presumably the idea is that costs will go down as HW becomes faster and models get more optimized/efficient. But LLMs already seem to be nearly a commodity, so it might become tricky for OpenAI to compete with a bunch of random services built on open models that offer the same thing (while spending a fraction on R&D).