Have you tried groq.com? Because I don't think gpt-4o is "incredibly" fast. I've been frustrated at how slow gpt-4-turbo has been lately, and gpt-4o just seems to be "acceptably" fast now, which is a big improvement, but still, not groq-level.
Yes, of course, probably sometime in the coming days. Some people say it already works in the playground.
I was wondering why OpenAI didn't release a smaller but faster model. 175 billion parameters works well, but speed is sometimes crucial. Like, a 20B-parameter model could run roughly 10x faster.
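As a back-of-envelope check on that claim (assuming per-token inference cost scales roughly linearly with parameter count, which ignores memory bandwidth and batching effects), the speedup would be about the parameter ratio:

```python
# Rough sketch: if decoding cost per token is ~linear in parameter
# count, the speedup from a smaller model is ~ the parameter ratio.
# Both model sizes here are just the figures from the comment above.
large_params = 175e9  # the 175B model mentioned
small_params = 20e9   # the hypothetical 20B model

speedup = large_params / small_params
print(f"Expected speedup: ~{speedup:.2f}x")  # ~8.75x, close to the 10x guess
```

So "10x" is in the right ballpark under that linear-cost assumption, though real-world throughput also depends on hardware and serving stack.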
I also see advertising (especially lower-budget productions, such as dropshipping ads or local TV commercials) being an early adopter of this technology once businesses get access to it at an affordable price.