
> Don't people rely on Nvidia for deep learning workflows? I thought that stuff ran on Linux?

It comes down to the fact that there are always two ways to do things when interacting with a GPU under Linux: the FOSS/vendor-neutral way, and the Nvidia way.

The machine learning crowd has largely gone the Nvidia way. Good luck getting your CUDA codebase working on any other GPU.
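As a hedged illustration of why a CUDA codebase is tied to Nvidia hardware: the `__global__` qualifier, the `<<<...>>>` launch syntax, and the `cuda*` runtime calls below are all specific to Nvidia's toolchain (a minimal vector-add sketch, assuming the CUDA toolkit and an Nvidia GPU are available):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Vendor-specific pieces: __global__, the <<<...>>> launch syntax,
// and the cudaMallocManaged/cudaDeviceSynchronize runtime API all
// come from Nvidia's CUDA toolkit and compile only with nvcc.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1024;
    float *a, *b, *c;
    // Unified (managed) memory keeps the example short; still Nvidia-only.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Porting code like this to another vendor means rewriting it against a different API (e.g. translating it to a portability layer), since none of these constructs exist outside Nvidia's stack.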

The desktop Linux crowd has largely gone the FOSS route. They have software that works well with AMD, Intel, VIA, and other manufacturers. Nvidia is the only one that wants vendor-specific code.

Thanks - makes sense.

> Nvidia is the only one that wants vendor-specific code.

Isn't that because CUDA is better and that tight software/hardware integration is more powerful?

If it weren't, then presumably people would be using AMD GPUs for their deep learning, but they aren't.


Even when it comes to graphics, where CUDA is not involved, Nvidia does not support the standard Linux APIs that every other GPU driver supports.

