A whole butt ton of GPU-style FMA and low-precision float multiply ALUs would be my guess.
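Roughly, an FMA unit computes a*b + c with a single rounding step, so the inner loop of a dot product maps straight onto it. A minimal C sketch, just to show the shape of the operation (fmaf is the single-precision fused multiply-add from <math.h>):

    #include <math.h>

    /* Dot product accumulated with fused multiply-add:
       each iteration computes acc = a[i]*b[i] + acc in one rounded step. */
    float dot_fma(const float *a, const float *b, int n) {
        float acc = 0.0f;
        for (int i = 0; i < n; i++)
            acc = fmaf(a[i], b[i], acc);
        return acc;
    }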


How low precision can you get and still have it be useful?


Google's TPU shows that 8 bits is still good enough.


Note that the TPU used 8-bit integer math, not even floating point.
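For a rough idea of what 8-bit integer math looks like (a common quantization scheme, not a claim about the TPU's actual pipeline): multiply int8 operands, accumulate in int32, then rescale back into int8 range. The scale parameter here is a placeholder; real quantized models derive it from calibration.

    #include <stdint.h>

    /* Sketch of an int8 dot product: 8-bit operands, 32-bit accumulator,
       then a float scale maps the result back into int8 range. */
    int8_t dot_int8(const int8_t *a, const int8_t *b, int n, float scale) {
        int32_t acc = 0;
        for (int i = 0; i < n; i++)
            acc += (int32_t)a[i] * (int32_t)b[i];
        float r = acc * scale;
        if (r > 127.0f)  r = 127.0f;   /* clamp to int8 range */
        if (r < -128.0f) r = -128.0f;
        return (int8_t)r;
    }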


Even 1-bit binary neural networks work.
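One common way a 1-bit dot product gets implemented: with weights and activations in {-1, +1} packed one per bit, the multiply becomes XNOR and the sum becomes a popcount. A sketch assuming 32 values packed per word (__builtin_popcount is the GCC/Clang intrinsic):

    #include <stdint.h>

    /* Binary dot product over packed {-1,+1} vectors:
       bit 1 encodes +1, bit 0 encodes -1.
       XNOR counts matching signs; dot = matches - mismatches. */
    int dot_binary(const uint32_t *a, const uint32_t *b, int nwords) {
        int matches = 0;
        for (int i = 0; i < nwords; i++)
            matches += __builtin_popcount(~(a[i] ^ b[i]));
        int nbits = 32 * nwords;
        return 2 * matches - nbits;
    }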



