> Comparing it to Moore's Law doesn't make any sense to me, though.

I assume it's meant as a qualitative comparison rather than a meaningful quantitative one. Sort of a (sub-)cultural touchstone to illustrate a point about which phase of development we're in.

With CPUs, during the phase of consistent year after year exponential growth, there were ripple effects on software. For example, for a while it was cost-prohibitive to run HTTPS for everything, then CPUs got faster and it wasn't anymore. So during that phase, you expected all kinds of things to keep changing.

If deep learning is in a similar phase, then whatever the numbers are, we can expect other things to keep changing as a result.



> then CPUs got faster and it wasn't anymore

The enabling tech was the AES-NI instruction set, not raw CPU speed.

Agree on the rest. The main reason modern CPUs and GPUs all support 16-bit floats is probably the deep learning trend.
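
To put a rough number on the fp16 point, here's a minimal NumPy sketch (the array shape is arbitrary, chosen only to show the memory halving that makes 16-bit floats attractive for model weights and activations):

    import numpy as np

    # Same number of parameters, half the memory in float16.
    # The 4096x4096 shape is arbitrary, picked only to illustrate the footprint difference.
    weights_fp32 = np.random.randn(4096, 4096).astype(np.float32)
    weights_fp16 = weights_fp32.astype(np.float16)

    print(weights_fp32.nbytes / 2**20, "MiB in float32")   # ~64 MiB
    print(weights_fp16.nbytes / 2**20, "MiB in float16")   # ~32 MiB

    # The trade-off: float16 has roughly 3 decimal digits of precision and a max
    # finite value of 65504, which is why training usually mixes fp16 compute
    # with fp32 accumulation/master weights.
    print(np.finfo(np.float16).max)   # largest finite float16 value (65504)
    print(np.finfo(np.float16).eps)   # ~0.000977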


If it hadn't been AES-NI, it would have been ChaCha20, which is much faster than unaccelerated AES and close to the speed of accelerated AES.

Phones use HTTPS without a problem, and they didn't get hardware-accelerated AES until recently.
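
It's easy to check on whatever hardware you have. Here's a rough benchmark sketch using the Python `cryptography` package; whether the AES path actually hits AES-NI depends on the OpenSSL build it links against, so treat any numbers as illustrative:

    import os
    import time
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    data = os.urandom(1024 * 1024)    # 1 MiB of plaintext
    nonce = os.urandom(12)            # nonce reuse is OK here only because this is a benchmark
    aes = AESGCM(AESGCM.generate_key(bit_length=128))
    chacha = ChaCha20Poly1305(ChaCha20Poly1305.generate_key())

    def mb_per_sec(cipher, rounds=100):
        start = time.perf_counter()
        for _ in range(rounds):
            cipher.encrypt(nonce, data, None)   # AEAD encrypt: nonce, plaintext, no associated data
        elapsed = time.perf_counter() - start
        return rounds * len(data) / elapsed / 1e6

    print(f"AES-128-GCM:       {mb_per_sec(aes):7.0f} MB/s")
    print(f"ChaCha20-Poly1305: {mb_per_sec(chacha):7.0f} MB/s")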


A phone needing to set up a dozen HTTPS sockets is nothing for its CPU, even without acceleration. A server needing to consistently set up hundreds of HTTPS sockets is where AES-NI and other accelerated crypto instructions become useful.
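
Back-of-envelope version of that, with made-up but (I think) plausible inputs; pure-software AES throughput and bytes per page vary a lot by device and site:

    # All numbers below are assumptions for illustration, not measurements.
    SOFT_AES_MB_S = 100        # assumed pure-software AES throughput on one core, MB/s
    PAGE_MB = 2                # assumed bytes transferred per page load, MB
    CONNS_PHONE = 12           # "a dozen HTTPS sockets"
    CONNS_SERVER_PER_S = 500   # "hundreds of sockets" being served continuously

    # Phone: a one-off cost per page load.
    phone_ms = PAGE_MB / SOFT_AES_MB_S * 1000
    print(f"phone bulk-encryption cost per page: ~{phone_ms:.0f} ms "
          f"(spread over {CONNS_PHONE} sockets)")

    # Server: the same per-connection cost, but sustained, eats whole cores.
    server_cores = CONNS_SERVER_PER_S * PAGE_MB / SOFT_AES_MB_S
    print(f"server bulk-encryption load: ~{server_cores:.0f} cores' worth without acceleration")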



