> Comparing it to Moore's Law doesn't make any sense to me, though.
I assume it's meant as a qualitative comparison rather than a meaningful quantitative one. Sort of a (sub-)cultural touchstone to illustrate a point about which phase of development we're in.
With CPUs, during the phase of consistent year-after-year exponential growth, there were ripple effects on software. For example, for a while it was cost-prohibitive to run HTTPS for everything; then CPUs got faster and it wasn't anymore. So during that phase, you expected all kinds of things to keep changing.
If deep learning is in a similar phase, then whatever the numbers are, we can expect other things to keep changing as a result.
A phone needing to set up a dozen HTTPS sockets is nothing for the CPU to handle, even without acceleration. A server needing to constantly set up hundreds of HTTPS sockets is where AES-NI and other accelerated crypto instructions become useful.
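To put rough numbers on the "a dozen sockets is nothing" point, here's a minimal Python sketch that times a handful of full TLS handshakes. `example.com` is just a placeholder endpoint, and the measured time is dominated by network round-trips, so the actual per-handshake CPU cost is even smaller than what it prints.

```python
import socket
import ssl
import time

HOST = "example.com"  # placeholder; substitute any TLS endpoint
N = 12                # roughly the "dozen sockets" from above

ctx = ssl.create_default_context()

start = time.perf_counter()
for _ in range(N):
    # wrap_socket performs the full TLS handshake on connect;
    # no session reuse, so each iteration pays the full cost.
    with socket.create_connection((HOST, 443), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST):
            pass
elapsed = time.perf_counter() - start

print(f"{N} TLS handshakes in {elapsed:.2f}s "
      f"({elapsed / N * 1000:.1f} ms each, network latency included)")
```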