Hacker News

I’m almost entirely sure Litwintschik is misinformed about the GPUs in his laptop.

Yes, he does have the Intel GPU he mentioned, but if he paid $200 to upgrade the GPU as he claims, he would also have a dedicated AMD Radeon Pro 5500M 8GB.



Actually, I don't think it's possible to configure a 16" MBP without a discrete GPU - they all come with AMDs. Only the 13" MBPs can be configured without one, but he says he's using a 16".


His other benchmarks of OmniSciDB ran on systems with Nvidia GPUs, but I think he's pointing out that in this case the AMD GPU wasn't used by the DB engine, even though it was part of the system config.


Yeah, I just checked and it appears to be a CUDA project, so neither the Intel nor AMD GPUs help here.

Still, that could have been clarified in the article.


Just to clarify: most of the query engine is built around LLVM-based JIT compilation. CUDA is not really used per se, except for GPU-specific operators like atomic aggregates and thread synchronization, and of course we use the driver API to manage the GPUs, allocate memory, etc. Supporting AMD GPUs or the upcoming Intel Xe GPUs (or frankly anything that has an LLVM backend) would not be particularly hard; it would just require adding similar supporting infrastructure.
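To illustrate the JIT idea for anyone unfamiliar: the win is compiling the query expression once into executable form, then running the compiled form over every row, instead of re-interpreting the expression per row. A toy Python sketch of that pattern (an analogy only — OmniSciDB generates LLVM IR, not Python bytecode, and `compile_filter` is a made-up name for illustration):

```python
# Toy sketch of JIT-style query compilation. The filter expression
# arrives at query time as a string, gets compiled once, and the
# resulting predicate is then evaluated cheaply per row.

def compile_filter(expr):
    """Compile a filter expression (e.g. "price > 100") into a reusable
    predicate; column names in `expr` resolve against each row dict."""
    code = compile(expr, "<query>", "eval")   # one-time compilation step
    return lambda row: eval(code, {}, row)    # per-row evaluation of compiled code

rows = [
    {"price": 50,  "qty": 3},
    {"price": 150, "qty": 1},
    {"price": 200, "qty": 7},
]

pred = compile_filter("price > 100 and qty < 5")
matches = [r for r in rows if pred(r)]
print(matches)  # [{'price': 150, 'qty': 1}]
```

The same shape holds in the real engine: swap the `compile` call for LLVM IR generation and the per-row lambda for a native function pointer, and retargeting to another GPU vendor mostly means pointing the codegen at a different LLVM backend.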


It was actually possible with the 2013–2015 Retina 15” models.


He may have been going by "About this Mac", which will show the Intel GPU if you're not plugged into an external monitor, or using a program that activates the dedicated GPU.


Sorry, I just cleaned that paragraph up. Cheers for the heads up.


Have you updated the webpage yet? I’m still seeing the same paragraph.



