What you're saying does happen to some degree, and in this instance, if I had linked some advance made with a diffusion model then I would get it. But about the only difference between this and ChatGPT is the data it's been trained on.
If OpenAI cared, the next version of GPT could be a state-of-the-art weather predictor.
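To make that concrete, here's a minimal sketch (my own illustration, not anything OpenAI has published): quantize continuous measurements into a discrete vocabulary and the task becomes ordinary next-token prediction, same as text.

```python
# Minimal sketch (my own illustration, not OpenAI's pipeline): a decoder-only
# transformer doesn't care whether its tokens came from text or from quantized
# weather measurements. Bin the readings into a discrete vocabulary and the
# task is ordinary next-token prediction.
import numpy as np

def quantize(series, n_bins=256):
    """Map continuous measurements (e.g. hourly temperatures) to token ids."""
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.digitize(series, edges)  # ints in [0, n_bins): a "vocabulary"

# Synthetic stand-in for real sensor data.
temps = 15 + 10 * np.sin(np.linspace(0, 20 * np.pi, 10_000)) + np.random.randn(10_000)
tokens = quantize(temps)
# From here, training is identical to a language model: predict tokens[t + 1]
# from tokens[: t + 1] with cross-entropy loss.
```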
I mean, by the same logic, the only difference between a diffusion model and a VLM is that you put the spatial transformer on the other end.
Yes, one of the powerful things about neural networks of every kind is that they're a very general class of function approximators. That we can use a similar toolkit of techniques to tackle a wide variety of problems is very cool and useful. Again, the analogy to statistical models is telling. You can model a lot of phenomena decently well with Gaussian distributions. Should we report this as "Normal distribution makes yet another groundbreaking discovery!"? Probably this wouldn't have the same impact, because people aren't being sold sci-fi stories about an anthropomorphized bell curve.

People who are using LLMs already think of "AI" as a thinking thing they're talking to, because they have been encouraged to do that by marketing. Attributing these discoveries made by scientists using this method to "AI" in the same way that we attribute answers produced by ChatGPT to "AI" is a deliberate and misleading conflation.
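To put the analogy in concrete terms, here's a trivial sketch (synthetic data, purely illustrative): fitting a Gaussian is routine tooling in the scientist's hands, and nobody would credit the distribution with the finding.

```python
# Sketch of the analogy with synthetic data: fitting a normal distribution is
# routine tooling, not the model "discovering" anything on its own.
import numpy as np

heights = np.random.normal(loc=170, scale=8, size=5_000)  # stand-in for real measurements
mu, sigma = heights.mean(), heights.std()  # maximum-likelihood Gaussian fit
print(f"Fitted N(mu={mu:.1f}, sigma={sigma:.1f})")
# The scientists chose the data, the model family, and the interpretation;
# the bell curve itself gets no byline.
```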
>I mean, by the same logic, the only difference between a diffusion model and a VLM is that you put the spatial transformer on the other end.
Maybe if that were the only difference, but it's not. There are diffusion models that have nothing to do with transformers or attention or anything like that, and using them for arbitrary sequence prediction is either not possible or highly non-trivial.
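For example, here's a minimal sketch of a DDPM-style training step with a purely convolutional denoiser (shapes and hyperparameters are illustrative, and a real model would also condition on the noise level). Diffusion is a training objective plus a sampling procedure, not a sequence architecture:

```python
# Minimal sketch: diffusion with no attention and no transformer anywhere.
# The denoiser is a plain conv stack standing in for a U-Net, as in the
# original DDPM-style image models. Shapes and hyperparameters are illustrative.
import torch
import torch.nn as nn

denoiser = nn.Sequential(  # purely convolutional; a real model would also take t
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, 3, padding=1),
)

x0 = torch.randn(8, 3, 32, 32)      # "clean" images (random stand-ins here)
t = torch.rand(8, 1, 1, 1)          # per-sample noise level in (0, 1)
noise = torch.randn_like(x0)
xt = (1 - t).sqrt() * x0 + t.sqrt() * noise   # corrupt toward pure noise
loss = ((denoiser(xt) - noise) ** 2).mean()   # predict the noise (DDPM objective)
# Nothing here is a sequence model; repurposing it for next-token prediction
# would mean rebuilding it, not just swapping the training data.
```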
Yes, all neural network architectures are function approximators, but that doesn't mean they excel equally at all tasks, or that you can even use them for anything other than a single task. This era of the transformer, where you can use a single architecture for NLP, computer vision, robotics, even reinforcement learning, is a very new one. Anything a bog-standard transformer can do, GPT could do if OpenAI wished.
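A rough sketch of what I mean (the dimensions and tokenizers are made up for illustration): the transformer core is identical across modalities, and only the tokenization in front of it changes.

```python
# Sketch of the "one architecture, many tasks" era: the same transformer core
# processes text, image patches, and robot states; only the input projection
# differs. Dimensions are illustrative, not any specific model's.
import torch
import torch.nn as nn

d_model = 256
core = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True),
    num_layers=4,
)

text = nn.Embedding(50_000, d_model)(torch.randint(0, 50_000, (1, 128)))
patches = nn.Linear(16 * 16 * 3, d_model)(torch.randn(1, 196, 16 * 16 * 3))
states = nn.Linear(32, d_model)(torch.randn(1, 64, 32))

for seq in (text, patches, states):
    print(core(seq).shape)  # the same weights handle every modality
```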
Like I said, I don't disagree with your broader point. I just don't think this is an instance of it.
It's clear from these responses that you're missing the point I'm making, but I'm unsure how to explain it better, and you're not really giving me much to work with in terms of engaging with its substance, so I think we gotta leave this at an impasse for now.