Hacker News

Of course a GPU from Nvidia is also an NPU. People are spending billions each month on Nvidia because it's a great NPU.

The fact is that a GPU from Nvidia is a much faster NPU than a CPU from Apple.

It is marketing, as you say, but it's deliberately misleading marketing. They could have simply written "the fastest NPU integrated into any CPU" instead. This is something Apple often does on purpose, and people believe it.




A GPU does other things. It’s designed to do something else. That’s why we call it a GPU.

It just happens to be good at neural stuff too.

There’s another difference too. Apple’s NPU is integrated into their chip. Intel and AMD are doing the same. A 4090 is not integrated into a CPU.

I’m somewhat guessing. Apple said NPU is the industry term; honestly, I’d never heard it before today. I don’t know if the official definition draws a distinction that would exclude GPUs or not.

I simply think the way Apple presented things seemed reasonable. When they made that claim the fact that they might be comparing against a 4090 never entered my mind. If they had said it was the fastest way to run neural networks I would have questioned it, no doubt. But that wasn’t the wording they used.


> A GPU does other things.

Yes, and so does the M4.

> It just happens to be it’s good at neural stuff too.

No, it's no coincidence. Nvidia has been focusing on neural nets, same as Apple.

> There’s another difference too. Apple’s NPU is integrated in their chip.

The neural processing capabilities of Nvidia products (Tensor Cores) are also integrated into the chip.

> A 4090 is not integrated into a CPU.

Correct, but nobody ever stated that. Apple stated that the M4 was faster than any AI PC today, not that it has the fastest NPU integrated into a CPU. And by the way, the M4 also contains a GPU.

> I don’t know if the official definition draws a distinction that would exclude GPUs or not.

An NPU can be part of a GPU, a CPU, or its own chip.

> If they had said it was the fastest way to run neural networks I would have questioned it,

They said fastest NPU, neural processing unit. It's the term Apple and a few others use for their AI accelerator. The whole point of an AI accelerator is performance and efficiency. If something does a better job at it, then it's a better AI accelerator.


Nvidia GPUs basically have an NPU, in the form of Tensor Cores. They don't just happen to be good at matmul; they have specific hardware designed to run neural networks.

There is no actual distinction. A GPU with Tensor Cores (i.e., matmul units) really does have an NPU just as much as a CPU with an NPU (i.e., matmul units) does.
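The "matmul units" framing can be made concrete: a dense neural-network layer is, at its core, a single matrix multiplication plus a bias add and an activation, which is why hardware that accelerates matmul accelerates most of inference. A minimal NumPy sketch (the shapes here are illustrative, not taken from any specific model):

```python
import numpy as np

# Hypothetical shapes: a batch of 32 activation vectors through one
# dense layer with 768 inputs and 3072 outputs.
rng = np.random.default_rng(0)
x = rng.standard_normal((32, 768))    # input activations
W = rng.standard_normal((768, 3072))  # layer weights
b = np.zeros(3072)                    # layer bias

# One dense layer = matmul + bias + ReLU. The matmul dominates the
# arithmetic, which is exactly what Tensor Cores / NPUs accelerate.
y = np.maximum(x @ W + b, 0.0)

print(y.shape)  # (32, 3072)
```

Whether the matmul hardware sits inside a discrete GPU or an SoC doesn't change what it computes.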


You know the G in GPU stands for Graphics, right? So if you want to play a game of words, Nvidia's device dedicated to something else is 30 times faster than Apple's "fastest" device dedicated specifically to neural processing.


At that point you could just call a GPU a CPU. There are meaningful distinctions to be made based on what the chip is used for exclusively.


I heard a joke recently that the 'G' in GPU should now be considered to mean "general".





