People thought the same thing about LSTMs, and then we got transformers. People thought the same about CNNs, and then we got ResNets.
Progress is always that way. It plateaus, then suddenly jumps and then plateaus again.
If your complaint is about the general move away from statistics and deep learning becoming the norm, there are a decent number of labs working on whatever the next deep learning will be. There is probabilistic programming, and there are models with newer biologically inspired computation structures.
Even within ML and deep learning, people are trying to find ways to better leverage unsupervised learning and to build large common-sense representations of the world.
There is certainly an oversupply of applied deep learning practitioners, but other approaches are being explored in the AI/ML community too.