A few years ago AMD split their GPU architectures into CDNA (focused on data center compute) and RDNA (focused on rendering for gaming and workstations). That in itself is fine, and it's what Nvidia was already doing; it makes sense to optimize silicon for each use case. But where AMD took a massive wrong turn is that they decided to drop compute (ROCm) support entirely for their RDNA (and all legacy) cards.
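
For context on what "supporting compute" means here: ROCm, AMD's compute stack, only officially enables a short list of gfx architectures. A minimal HIP sketch along these lines (assuming a working ROCm install and hipcc; the file name is made up) prints the architecture string the runtime reports, which is what ROCm's support list is keyed on. On an officially unsupported RDNA card, the device may not even enumerate.

    // Enumerate GPUs visible to the ROCm/HIP runtime and print the
    // gfx architecture name ROCm keys its support list on.
    // Build (assumed file name): hipcc list_amd_gpus.cpp -o list_amd_gpus
    #include <hip/hip_runtime.h>
    #include <cstdio>

    int main() {
        int count = 0;
        if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
            std::printf("no HIP-capable devices visible\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            hipDeviceProp_t prop;
            if (hipGetDeviceProperties(&prop, i) == hipSuccess)
                std::printf("device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
        }
        return 0;
    }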

I'm not sure what AMD expected to happen there, especially when Nvidia continues to support CUDA on basically every GPU they've ever made: https://developer.nvidia.com/cuda-gpus#compute (it looks like support goes back to the GeForce 9400 GT, released in 2008).
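
For comparison, the CUDA-side equivalent is below (again just a sketch, with an assumed file name). The compute capability it prints is the version number that Nvidia page is indexed by; one caveat is that the newest toolkits drop the oldest capabilities, so a 2008-era card needs a correspondingly old CUDA toolkit.

    // Enumerate CUDA GPUs and print the compute capability that
    // Nvidia's support page (linked above) is indexed by.
    // Build (assumed file name): nvcc list_nv_gpus.cu -o list_nv_gpus
    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("no CUDA-capable devices visible\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            if (cudaGetDeviceProperties(&prop, i) == cudaSuccess)
                std::printf("device %d: %s (compute capability %d.%d)\n",
                            i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }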

It's like they don't care about having a pipeline of programmers ready to use their hardware, and are willing to ignore most of the workstation market.