Nvidia has gotten lucky repeatedly. Their GPUs were great for PC gaming, and they were the top dog there. The crypto boom was such an unexpected win for them partly because Intel killed off their competition by acquiring it. The AI boom, too, traces back to Intel killing off that competition, but the acquisition is too far removed from the event to credit it directly.
Unlike the crypto boom, though, two factors make me think the AI thing is bound to go away quickly.
Unlike crypto, there is no mathematical lower bound on the computation required, and if you look at technology's history you can tell the models are going to get better/smaller/faster over time, reducing our reliance on the GPU.
Crypto was fringe, but AI is fundamental to every software stack and every company. There is way too much money in this to just let Nvidia take it all. One way or another, the reliance on it will be reduced.
> the models are going to get better/smaller/faster over time, reducing our reliance on the GPU
Yes, because we've seen that with other software: I no longer want a GPU for my computer because I play games from the 90s, and the CPU has grown powerful enough to suffice... except that's not the case at all. Software grew in complexity and quality along with the available compute resources, and we have no reason to think "AI" will be any different.
Are you satisfied with today's models and their inaccuracies and hallucinations? Why do you think we will solve those problems without more HW?
Because that's what history shows us. Back in the 90s, MPEG-1/2 took dedicated hardware expansion cards to handle the encoding because software was just too damn slow. Eventually, CPUs caught up, and dedicated instructions were added to the CPU to make software encoding multiple times faster than real time. Then H.264 came along and CPUs were too slow for encoding again. Special instructions were added to the CPU again, and software encoding became multiple times faster than real time again. We're now at H.265 and 8K video, where encoding is slow on the CPU. Can you guess what the next step will be?
Not all software is written so badly that it becomes bloatware. Some people still squeeze out everything they can; admittedly, their numbers are getting smaller. The quote "why would I spend money to optimize Windows when hardware keeps improving" does seem to be the groupthink now. If only more people gave a shit about their code instead of just chasing some bonus accomplishment.
But seriously, video encoding isn't AI. Video encoding is a well-understood problem. We can't even make "AI" that doesn't hallucinate yet, and we're not sure what architectures will be needed for progress in AI. I get that we're all drunk on our analogies in the vacuum of our ignorance, but we need a bit of humility and awareness of where we're at.
That includes considering that it can't be made much better, that the hallucinations are a fundamental trait that cannot be eliminated, and that this will all come tumbling down in a year or three. You seem to want to consider every possible positive future, as long as we just work harder or longer at it, while ignoring the most likely outcomes, which are nearer term and far from positive.
Conversely, can you name one computing task that was hard when it was first attempted and is still hard in the same way today, after generations of software/hardware improvements?
Simulations, and pretty much any large-scale modelling task. Why do you think people build supercomputers?
Now that I mention it, I think supercomputers and the jobs they run are the perfect analog for AI at this stage. It's the kind of problem we could throw nearly limitless compute at if it were cost-effective to do so. HPC encompasses a class of problems for which we have to make compromises because we can't begin to compute the ideal (sort of like using reduced precision in deep learning). HPC-scale problems have always been hard, and as we add capability we will likely just soak it up to perform more accurate or larger computations.
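To make the "reduced precision" compromise concrete, here's a minimal sketch, assuming NumPy is available (the array size and the sum-of-squares reduction are arbitrary choices for illustration), of the accuracy-for-cost trade-off that HPC and deep-learning workloads routinely accept:

```python
import numpy as np

# Toy illustration: the same reduction computed in float64 and float16.
# Lower precision is cheaper (2 bytes per element instead of 8) but less
# exact, which is the kind of compromise HPC and deep learning accept.
rng = np.random.default_rng(0)
x64 = rng.standard_normal(10_000)   # "ideal" precision
x16 = x64.astype(np.float16)        # reduced-precision copy

exact = np.sum(x64 * x64)           # sum of squares in float64
cheap = np.sum(x16 * x16)           # same computation in float16

rel_err = abs(exact - float(cheap)) / abs(exact)
print(f"float64 result: {exact:.6f}")
print(f"float16 result: {float(cheap):.6f}")
print(f"relative error from dropping precision: {rel_err:.2e}")
```

The half-precision copy takes a quarter of the memory and bandwidth at the price of a measurable error; deciding whether that error is acceptable is exactly the kind of compromise described above.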
To quote Andrej Karpathy
(https://x.com/karpathy/status/1883941452738355376): "I will say that Deep Learning has a legendary ravenous appetite for compute, like no other algorithm that has ever been developed in AI. You may not always be utilizing it fully but I would never bet against compute as the upper bound for achievable intelligence in the long run. Not just for an individual final training run, but also for the entire innovation / experimentation engine that silently underlies all the algorithmic innovations."
VR is as dead this time around as it was in the mid-2000s, the mid-1990s, and the mid-1980s. Each time I've used it, it was just as awful as before: nausea, eyestrain, headaches, neck and face fatigue. It's truly a f**ed space, and it has failed over and over, this time with Apple and Facebook spending tens of billions on it. VR is a perfect reply to your question here.
Honestly, you'd be shocked at how much gaming you can get done on the integrated GPUs that get shoved into chips these days. Sure, you won't be playing the most graphically demanding things, but think of platforms like the Switch, or games like Stardew. You can easily go without a dedicated GPU and still have a plethora of games.
And as for AI, there's so much room for improvement on the software side that the smarter, more performant models won't necessarily have to run on top-of-the-line hardware.
I think the point was not that we won't still use a lot of hardware; it's that the hardware won't necessarily always be Nvidia's. Nvidia got lucky when both crypto and AI arrived because it had the best ready-made thing available to do the job, but that's not the same as the best possible thing. Crypto eventually got its ASICs, which made GPUs uncompetitive, after all.
The AAA games industry is struggling (e.g., look at the profit warnings, share-price drops, and studio closures) specifically because people are doing that en masse.
But those 90s-style games are not actually old: retro has become a movement within gaming, and there is a whole cottage industry of "indie" games built in that aesthetic because it is cheap and fun.
Money isn't fringe, and the target for crypto is all transactions, rather than the existing model where you pay between 2% and 3.5% to a card company or other middleman.
Credit card companies averaged over 22,000 transactions per second in 2023 without ever having to raise the fee. How many is crypto even capable of processing? Processing without the fee going up? What fraud protection guarantees are offered to the parties of crypto transactions?
Does everyone just need to get out of Bitcoin and get into Solana before a stampede happens? If Bitcoin crashes, all coins will crash, because there are hundreds of them to choose from. You're playing with tulips.
Yes, you are right. The traditional financial system is indeed more popular than crypto.
I’m not sure what your point is, but yes, I absolutely agree.
Obviously, that has no effect on the capacity of crypto to take over the volume of existing financial transactions and largely replace the existing middlemen.
Random old tech from 2015 also had wildly fluctuating transaction fees. Likewise, I can't run Call of Duty on my ZX Spectrum. I'm not sure what your point is there either, but yes, I agree. Obviously old tech being old doesn't affect the capabilities of new tech, and the vast majority of payments are done on Solana rather than those old networks.
> Come on now, you know I was referring to consumer fraud
No I didn’t. But it was late.
My point, that crypto already has the capacity and the low fees, remains unscathed.