Noob question: It sounds like the statement is "we learned it's cheap to copy an already-built model from its outputs," but is it still expensive to train a new (better) base model?
If so, is this mostly a concern because there’s little moat available now to those who pay to train the better base models?
Nope, you have the article right: "if you copy an existing model, you can get a pretty comparable model!"
Really, what [the underlying paper](https://arxiv.org/abs/2501.19393) shows is how much you can improve existing models with several of the techniques OpenAI released recently (extending thought processes via "wait").
It shows we can get a lot of mileage out of that technique.
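For anyone curious what "extending thought processes via 'wait'" means mechanically, here's a minimal sketch of the idea (the paper calls it budget forcing): when the model tries to close its thinking block early, you suppress the end-of-thinking marker and append "Wait" so it keeps reasoning. The `generate` function below is a stand-in stub, not a real model API, and the marker string is just an assumption for illustration.

```python
# Sketch of budget forcing: intercept premature end-of-thinking
# markers and nudge the model to keep going with "Wait".

END_OF_THINKING = "</think>"  # assumed delimiter, varies by model

def generate(prompt: str) -> str:
    """Stub standing in for a real LLM call; it 'finishes' immediately."""
    return " ...some reasoning..." + END_OF_THINKING

def think_with_budget(prompt: str, min_continuations: int = 2) -> str:
    """Force extra reasoning by replacing early stops with 'Wait,'
    a fixed number of times before letting the trace end."""
    trace = generate(prompt)
    for _ in range(min_continuations):
        if END_OF_THINKING in trace:
            # Strip the premature stop and append the continuation cue,
            # then let the model keep generating from the extended trace.
            trace = trace.replace(END_OF_THINKING, "") + " Wait,"
            trace += generate(trace)
    return trace

print(think_with_budget("How many r's are in strawberry?"))
```

The surprising result is that this one trick, applied at inference time with no retraining, meaningfully improves reasoning performance.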