Yes. The vast majority of computing is still underpowered. Chromebooks, for example. The fanless Apple Silicon MacBook Air only arrived in late 2020. And I would argue that if we want AR or other latency-sensitive applications, our computing power is still off by at least an order of magnitude.
That’s not true in many domains, where doing it on a personal computer would either take far too long outright, or only be feasible insofar as you are skilled at using faster memory as a cache.

Video production, climate simulations, PDE solvers, protein folding, etc. etc.
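To make the "faster memory as cache" point concrete, here's a minimal out-of-core sketch in Python (the file name and chunk size are made up): instead of loading an array that dwarfs RAM, you memory-map it on disk and stream fixed-size chunks through fast memory.

    import numpy as np

    # Hypothetical dataset: far larger than RAM, stored on disk.
    # np.memmap lets the OS page in only what is touched, so RAM
    # effectively acts as a cache over the on-disk array.
    data = np.memmap("simulation_output.f32", dtype=np.float32, mode="r")

    chunk_size = 64 * 1024 * 1024  # elements per chunk (~256 MB of float32)
    total = 0.0
    for start in range(0, data.shape[0], chunk_size):
        chunk = np.array(data[start:start + chunk_size])  # copy chunk into RAM
        total += float(chunk.sum())  # do the per-chunk processing here

    print("sum over out-of-core array:", total)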
I agree with you; all of those have always needed vastly more compute than was available in a PC. If anything, the power of modern hardware has brought a lot of it within reach of personal workstations. Though it is true that hyped-for-the-masses personal computing devices are not optimized in that direction. You get what you pay for.
Likewise, it made neural nets trainable on home PCs at sizes that weren't feasible in the past. Unfortunately for GP, as well as for many people in the disciplines I listed, problem sizes have grown even faster, especially now that compute and electricity translate directly into cash.
The especially annoying part is that it's not just about speed, but about AI tools being closely tied to a specific architecture. Lots of them only work on Nvidia cards, not on AMD, and a fallback to CPU is often not provided either. If you don't have enough VRAM, a lot of them won't run at all, not just run more slowly.
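For what it's worth, a graceful fallback isn't hard to write; here's a minimal sketch assuming a PyTorch-based tool (the tiny Linear model is just a stand-in for a real one):

    import torch

    # Prefer CUDA when present, but degrade to CPU instead of crashing.
    # Many tools skip this check and simply assume an Nvidia GPU.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = torch.nn.Linear(1024, 1024).to(device)  # stand-in for a real model
    x = torch.randn(8, 1024, device=device)

    try:
        y = model(x)
    except torch.cuda.OutOfMemoryError:  # PyTorch >= 1.13
        # Fall back to CPU when the model/batch doesn't fit in VRAM.
        model, x = model.cpu(), x.cpu()
        y = model(x)

    print("ran on:", y.device)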
In the past you could do almost anything on a personal computer; it was generally about as fast as a mainframe or high-end workstation.
Training large AI models is currently impossible for most home users. They just do not have the processing power.
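A rough back-of-envelope shows why. With standard mixed-precision Adam training, each parameter costs on the order of 16 bytes of model state (fp16 weights and gradients, plus an fp32 master copy and two Adam moment buffers), before you even count activations; the model sizes below are just illustrative:

    # Rough memory estimate for training with Adam in mixed precision:
    # ~2 B weights + 2 B grads + 4 B fp32 master copy + 8 B Adam moments
    BYTES_PER_PARAM = 16  # rule of thumb, excludes activations

    for params in (7e9, 70e9):  # illustrative model sizes
        gib = params * BYTES_PER_PARAM / 2**30
        print(f"{params / 1e9:.0f}B params -> ~{gib:,.0f} GiB of model state")

    # 7B params -> ~104 GiB, far beyond the 8-24 GB of VRAM on consumer GPUs.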