
Nvidia has gotten lucky repeatedly. Its GPUs were great for PC gaming, where it was top dog. The crypto boom was such an unexpected win for them partly because Intel killed off their competition by acquiring it. The AI boom is also a downstream result of Intel killing off their competition, but the acquisition is too far removed for that event to get the credit.

Unlike the crypto boom, though, two factors make me think the AI thing is bound to fade quickly.

Unlike crypto, there is no mathematical lower bound on the computation required, and if you look at technology's history you can tell the models are going to get better/smaller/faster over time, reducing our reliance on the GPU.
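The crypto contrast is worth spelling out: proof-of-work cost is set by the protocol itself, so no software cleverness can shrink it. A toy sketch of that idea (hypothetical parameters, illustration only, not any real coin's scheme):

```python
import hashlib

def mine(data: bytes, difficulty_bits: int) -> int:
    """Find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits.

    Expected work is ~2**difficulty_bits hash evaluations no matter how
    clever the implementation -- the protocol *defines* the cost, which is
    the "mathematical lower bound" crypto has and AI inference does not.
    """
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~65k hashes expected at 16 bits of difficulty
nonce = mine(b"block header", 16)
```

Raising `difficulty_bits` by one doubles the expected work, regardless of hardware or algorithm, which is exactly the property AI workloads lack.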

Crypto was fringe, but AI is fundamental to every software stack and every company. There is way too much money in this to just let Nvidia take it all. One way or another, the reliance on it will be reduced.






> the models are going to get better/smaller/faster over time reducing our reliance on the GPU

Yes, because we've seen that with other software. I no longer want a GPU for my computer because I play games from the 90s and the CPU has grown powerful enough to suffice... except that's not the case at all. Software grew in complexity and quality with available compute resources and we have no reason to think "AI" will be any different.

Are you satisfied with today's models and their inaccuracies and hallucinations? Why do you think we will solve those problems without more HW?


Because that's what history shows us. Back in the '90s, MPEG-1/2 encoding took dedicated hardware expansion cards because software was just too damn slow. Eventually CPUs caught up, and dedicated instructions were added to the CPU that made software encoding multiple times faster than real time. Then H.264 came along and CPUs were too slow for encoding again. Special instructions were added to the CPU again, and software encoding became multiple times faster than real time again. We're now at H.265 and 8K video, where encoding is slow on the CPU. Can you guess what the next step will be?

Not all software is written so badly that it becomes bloatware. Some people still squeeze out everything they can, though admittedly their numbers are shrinking. The attitude "why would I spend money optimizing Windows when hardware keeps improving" does seem to be groupthink now. If only more people gave a shit about their code instead of meeting some bonus target.


> Can you guess what the next step will be?

H.266 32K encoding being slow on cpu


> Can you guess what the next step will be?

He fixes the cable?

But seriously, video encoding isn't AI. Video encoding is a well understood problem. We can't even make "AI" that doesn't hallucinate yet. We're not sure what architectures will be needed for progress in AI. I get that we're all drunk on our analogies in the vacuum of our ignorance but we need to have a bit of humility and awareness of where we're at.


That includes considering that it can't be made much better, that the hallucinations are a fundamental trait that cannot be eliminated, and that this will all come tumbling down in a year or three. You seem willing to consider every possible positive future if we just work harder or longer at it, while ignoring the most likely outcomes, which are nearer term and far from positive.

There are already strategies to reduce hallucinations but, guess what? I'll let you fill in the rest.

Conversely, can you name one computing task that was hard when it first appeared and is still hard in the same way today, after generations of software and hardware improvements?

Simulations, and pretty much any large-scale modelling task. Why do you think people build supercomputers?

Now that I mention it, I think supercomputers and the jobs they run are the perfect analog for AI at this stage. It's a problem we could throw nearly limitless compute at if it were cost-effective to do so. HPC encompasses a class of problems where we have to make compromises because we can't begin to compute the ideal (sort of like using reduced precision in deep learning). HPC-scale problems have always been hard, and as we add capability we will likely just soak it up performing more accurate or larger computations.
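The reduced-precision compromise mentioned above can be seen in miniature with NumPy (assuming NumPy is available; the array size is arbitrary):

```python
import numpy as np

# Same million-element array at full and reduced precision. Dropping from
# float64 to float16 fits 4x as many values in the same memory (and moves
# 4x less data per operation), at the cost of precision and dynamic range --
# the classic HPC / deep-learning trade-off.
n = 1_000_000
a64 = np.zeros(n, dtype=np.float64)
a16 = np.zeros(n, dtype=np.float16)

print(a64.nbytes)  # 8000000 bytes
print(a16.nbytes)  # 2000000 bytes
```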

To quote Andrej Karpathy (https://x.com/karpathy/status/1883941452738355376): "I will say that Deep Learning has a legendary ravenous appetite for compute, like no other algorithm that has ever been developed in AI. You may not always be utilizing it fully but I would never bet against compute as the upper bound for achievable intelligence in the long run. Not just for an individual final training run, but also for the entire innovation / experimentation engine that silently underlies all the algorithmic innovations."


VR is as dead this time around as it was in the mid-2000s, the mid-1990s, and the mid-1980s. Each time I've used it, it was just as awful as before: nausea, eyestrain, headaches, neck and face fatigue. It's a truly f**ed space that has failed over and over, this time with Apple and Facebook spending tens of billions on it. VR is a perfect reply to your question here.

Um, not really. Whether a tech gains popularity or withers on the vine has nothing to do with code and hardware maturity.

Your rant on VR is just weird and out of place here.


The Entscheidungsproblem, from Leibniz's 17th-century dream through Hilbert's 1928 formulation to today, and forever into the future.

3D graphics.

Honestly, you'd be shocked at how much gaming you can do on the integrated GPUs that just get shoved in these days. Sure, you won't be playing the most graphically demanding titles, but think of platforms like the Switch, or games like Stardew Valley. You can easily go without a dedicated GPU and still have a plethora of games.

And as for AI, there is probably so much room for improvement on the software side that the smarter, more performant models won't necessarily have to run on top-of-the-line hardware.


Just look at how much insects get done with just a few neurons working together...

Those neurons aren't language models though. They're not encoding petabytes of human knowledge.

Those neurons aren't anything remotely similar to a "neuron" in an LLM, for instance.

I think the point was not that we won't still use a lot of hardware, it's that it won't necessarily always be Nvidia. Nvidia got lucky when both crypto and AI arrived because it had the best available ready-made thing to do the job, but it's not like it's the best possible thing. Crypto eventually got its ASICs that made GPUs uncompetitive after all.

Interesting take.

The AAA games industry is struggling (e.g. look at the profit warnings, share-price drops, and studio closures) specifically because people are doing that en masse.

But those '90s games are not "old": retro has become a movement within gaming, and there is a whole cottage industry of "indie" games building on that aesthetic because it is cheap and fun.


Is it luck, or is scaling arithmetic genuinely a useful capability to offer the world?

> Unlike crypto there is no mathematical lower bound for computation

This is what I feel too, but is there any scientific proof of that?


Yes, the human brain

Money isn’t fringe, and the target for crypto is all transactions, rather than the existing model where you pay between 2% and 3.5% to a card company or other middleman.

Credit card companies averaged over 22,000 transactions per second in 2023 without ever having to raise the fee. How many is crypto even capable of processing? Processing without the fee going up? What fraud protection guarantees are offered to the parties of crypto transactions?
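For scale, that sustained average rate works out to roughly two billion transactions a day:

```python
# Rough arithmetic on the figure quoted above: 22,000 transactions per
# second, sustained around the clock, is about 1.9 billion per day.
card_avg_tps = 22_000
seconds_per_day = 24 * 60 * 60        # 86,400
tx_per_day = card_avg_tps * seconds_per_day
print(tx_per_day)                     # 1900800000, i.e. ~1.9B transactions/day
```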

> Credit card companies averaged over 22,000 transactions per second in 2023 without ever having to raise the fee.

Oh good because the fee is insanely high.

> How many is crypto even capable of processing?

About 1M a second including voting transactions; divide by 4 for the non-voting TPS.

https://youtu.be/8sl3RcN2Rdk?si=saRTd-fQqG1-L_kb

Transaction fee for a simple transfer is a fraction of a penny.

I’m not sure if your last question is regarding consumer fraud or cryptographic fraud proofs so I won’t answer it until you clarify.
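Taking the figures in this subthread at face value (they are the commenters' claims, not audited numbers), the comparison works out to:

```python
# The 1M TPS figure and the divide-by-4 voting overhead are the claims made
# above, not measured data; the 22,000 TPS credit-card average is the figure
# cited earlier in the thread.
solana_total_tps = 1_000_000
solana_user_tps = solana_total_tps // 4   # non-voting transactions only
card_avg_tps = 22_000

print(solana_user_tps)                    # 250000
print(round(solana_user_tps / card_avg_tps, 1))  # ~11.4x the card average
```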


Most money and most transactions aren't in Solana. Solana processes ~15B/day. Credit card companies, 15T. Solana has only ever reached 2k/s.

https://capitaloneshopping.com/research/number-of-credit-car...

Come on now, you know I was referring to consumer fraud. And there is no protection, unless you go off chain.

Transaction fees in the most common coin, Bitcoin, vary wildly. The spike rose to over $100 at one point.

https://ycharts.com/indicators/bitcoin_average_transaction_f...

Does everyone just need to get out of Bitcoin and into Solana before a stampede happens? If Bitcoin crashes, all coins will crash, because there are hundreds of them to choose from. You're playing with tulips.


Yes, you are right. The traditional financial system is indeed more popular than crypto. I’m not sure what your point is, but yes, I absolutely agree.

Obviously, that has no bearing on the capacity of crypto to take over the volume of existing financial transactions and largely replace the existing middlemen.

Random old tech from 2015 also had wildly fluctuating transaction fees. Likewise, I can’t run Call of Duty on my ZX Spectrum. I’m not sure what your point is there either, but yes, I agree. Obviously, old tech being old doesn’t affect the capabilities of new tech, and the vast majority of payments are done on Solana rather than on these old networks.

> Come on now, you know I was referring to consumer fraud

No I didn’t. But it was late.

My point, that crypto already has the capacity and the low fees, remains unscathed.


No response to Bitcoin being woefully inadequate in terms of capacity? Do you just think everyone is going to switch to your preferred coin? Oof.

Oh, and nice indignation to avoid discussing fraud protection. The only issue relevant to both systems is fraud protection.

Well, good luck to you.


Agreement re: Bitcoin being woefully inadequate. See the point about the ZX Spectrum not being able to play Call of Duty.

> Do you just think everyone is going to switch

I don’t have to; other modern networks exist, but stablecoin transactions are already mainly on Solana, so we don’t have to wait for a switch.

Happy to discuss consumer fraud protection after you acknowledge a single one of my prior points.


Bitcoin still has 58% of the market.

https://coinmarketcap.com/charts/bitcoin-dominance/

Solana is $5B/day. Bitcoin is $50B/day.

People just don't store value in Solana like they do in Bitcoin.


Not in payments. Look up stablecoin transactions. Solana is by far the most active.


