I'm not sure I see the bear argument for NVidia here. Huge AI models certainly drive NVidia sales, but those same models are also widely thought to be untrainable, and nearly un-runnable, outside of large datacenters.

To me, this is ripe for an application of the Jevons paradox. If architectural improvements make similar models cheaper, I would expect to see more of them trained and deployed, not fewer, ultimately increasing the market for GPU-like hardware.
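To make that concrete, here is a toy back-of-the-envelope sketch in Python. Every number is a made-up assumption, not market data; the only point is that when demand is elastic enough, aggregate GPU demand rises even as per-model cost falls:

    # Toy Jevons-paradox arithmetic; all figures are hypothetical assumptions.
    cost_per_model_before = 100.0  # GPU-hours (arbitrary units) to train one model today
    cost_per_model_after = 10.0    # assumed 10x efficiency gain from better architectures

    models_before = 50             # assumed number of such models trained per year today
    models_after = 1000            # assumed 20x more models once training gets cheaper

    demand_before = cost_per_model_before * models_before  # 5,000
    demand_after = cost_per_model_after * models_after     # 10,000

    print(demand_before, demand_after)
    # Per-model cost fell 10x, yet total GPU demand doubled, because the number
    # of models trained grew faster than the cost fell (price elasticity > 1).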
