Happy 18th Birthday CUDA (thechipletter.substack.com)
26 points by chmaynard 4 months ago | 5 comments


> Some people confuse CUDA, launched in 2006, for a programming language — or maybe an API. With over 150 CUDA-based libraries, SDKs, and profiling and optimization tools, it represents far more than that.

This is exactly the key point that all would-be replacements forget when they offer only a C or C++ API, which is just the tip of the iceberg.

Intel and AMD never bothered to offer anything beyond a barebones OpenCL experience in C, until it was too late to catch up with a polyglot ecosystem (C, C++, Fortran, anything that targets PTX), IDE integration, tooling, and libraries.


I joined Nvidia in Oct 2008 and worked on this early CUDA stack, specifically on the late-stage compiler optimizations into which both the CUDA and graphics (gfx) paths merged. CUDA was this "annoyance" we had to deal with: lots of scenarios that would never show up in gfx code suddenly became possible once code generated by the CUDA toolchain had to be considered. Gfx made all the money, yet CUDA was treated as a first-class citizen from the day it was born.


Maybe I remember it wrong, but when CUDA came out it also replaced Cg (C for Graphics), which at the time was used, or at least aimed, to fit a lot of massively parallel workloads.

While strictly speaking the two are very different, I would tend to think a lot of the ideas and work behind CUDA came directly from working on Cg, so CUDA's history could at least span back to its introduction in 2001.

And of course, in a separate interview Jensen claims that this was the goal from the day the company was founded, and that graphics was only their starting point.

In any case, their success is anything but luck. Name a single other company that invested in a side quest continuously and rigorously for 15 years before hitting any sort of jackpot. Most would have given up. That's the beauty of a founder-led company versus one that answers to investor returns.


Yes. Jensen lived Andy Grove's maxim: only the paranoid survive. I remember multiple all-hands meetings where he would remind everyone of the fate of the sound card companies. He knew from the beginning that moving from graphics acceleration to fundamental compute was the only way to stay relevant and not get integrated into the CPU/SoC. On one hand, he tried becoming that CPU, via the Transmeta deal and various attempts at x86-based systems. On the other, he just doubled down on graphics, embracing its complexity and the consumer thirst for more, buying time for the right parallel applications (machine learning, gen AI) to come along and turn his accelerator into a core piece of hardware. Beautiful story.


Thank you for sharing this. I sometimes wish their Grace CPU were more widely available to consumers and ran Windows. Unfortunately, consumer CPUs aren't something they are interested in, which leaves that whole market to Qualcomm.

Although in recent months they seem to want to get there via a partnership with MediaTek.

Edit: And I suddenly remembered that their upcoming ARM CPU microarchitecture is completely new, not a design from the ARM Cortex line.



