
> It would probably take a lot of investment from AMD to make that happen, I imagine

Don't AMD deliberately gimp their consumer cards to prevent cannibalising the pro cards? I vaguely recall reading about that a while back.

That being the case, they have already done the R&D; they just chose to reserve the tech for the higher-margin kit, which keeps hobbyists from buying AMD.



A few years ago AMD split their GPU architectures into CDNA (focused on data center compute) and RDNA (focused on rendering for gaming and workstations). That in itself is fine, and it's what Nvidia was already doing; it makes sense to optimize silicon for each use case. But where AMD took a massive wrong turn is that they decided to stop supporting compute entirely on their RDNA (and all legacy) cards.

I'm not sure what AMD expected to happen when they did that, especially since Nvidia continues to support CUDA on basically every GPU they've ever made: https://developer.nvidia.com/cuda-gpus#compute (it looks like support goes back to the GeForce 9400 GT, released in 2008).
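
For anyone curious what that looks like in practice, here's a minimal sketch (my own, not from the linked page) that uses the standard CUDA runtime API to print the compute capability of each device, i.e. the same number the CUDA GPUs page lists per card. The file name and compile line are just examples.

    // query_cc.cu -- list CUDA devices and their compute capability.
    // Build (assuming a CUDA toolkit is installed): nvcc query_cc.cu -o query_cc
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess || count == 0) {
            std::printf("No CUDA devices found (%s)\n", cudaGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // prop.major/prop.minor is the compute capability, e.g. 1.1 for
            // old cards like the GeForce 9400 GT, 8.9 for an RTX 4090.
            std::printf("Device %d: %s, compute capability %d.%d\n",
                        i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }

(Whether a given toolkit release still compiles for a very old compute capability is a separate question; the point is that the runtime and the documentation cover the whole range of hardware.)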


It's like they don't care about having a pipeline of programmers ready to use their hardware, and want to ignore most of the workstation market.



