> I answered your question with a joke because your question is a joke, what doesn't benefit from a 10-fold increase in processing power?
You misread the question pretty badly. They asked for new use cases, not things that would benefit.
We can fold proteins pretty well now, 10x wouldn't fundamentally change things.
There was a point where 10x in power made VR feasible. We're past that point, and can already do 144Hz VR with pretty good graphics. More power would increase the resolution but that's not a new use case.
Deep learning, hmm. A 10x increase in GPU RAM would be pretty great for using things like GPT-3, but the limiting factor there isn't really processing power.
Remember that you said "sorely need". That's a much stronger statement than benefiting.
> Why would no new use case arise from a widely available 10-fold increase in processing power when the last 40 years have shown that new tech always materializes?
Well, I can look back about a decade for the last 10-fold increase. Everyone has an SSD now, that's major but not based on processing power. VR is the only significant new use case I can name. That's cool but it's not exactly a big impressive list.
> Well, I can look back about a decade for the last 10-fold increase.
A decade ago (2011) we didn't even have "deep learning"; the real takeoff started around 2013, when GPUs became good enough that things like AlexNet/ImageNet were practical. So that's one.
> You misread the question pretty badly. They asked for new use cases, not things that would benefit.
I answered their question pretty directly, and I address your point in my reply as well. A 10-fold increase in computing power brought us affordable smartphones (iPhones existed in 2011, but they were mostly for the well-off among us).
A 10-fold increase in processing power brought us the video streaming services of today. Without modern processing capabilities, Netflix and co. probably wouldn't exist in any capacity. A 100 Gbps network interface just wasn't a thing in 2011, and not because engineers didn't have the idea.
A new Pixel 6 can erase people from my pictures automatically, on-device. That's another very nice use case that wasn't possible on the hardware of even five years ago.
If you want to argue that a 10-year horizon wasn't large enough, then sure, but there are literally thousands of use cases that only exist because computing scales and gets cheaper every year.