According to Nvidia’s 2025 annual report [1], 34% of their sales for 2025 came from just 3 customers.
Additionally, they mention that customers can cancel purchases with little notice and without penalty [2].
This is not unique among hardware companies, but it means all it takes is one of these customers pulling back for their sales to drop by roughly 12% (~$14B).
To cut to the point, my guess is that Nvidia is not sustainable, and at some point one or more of these big customers won’t be able to keep up with the big orders, which will cause Nvidia to miss their earnings, and then it will burst. But maybe I’m wrong here.
[2] Same report, page 116:
> Because most of our sales are made on a purchase order basis, our customers can generally cancel, change, or delay product purchase commitments with little notice to us and without penalty.
I have lots of skepticism about everything involved in this, but on this particular point:
It's a bit like TSMC: you couldn't buy space on a $latestGen fab because Apple had already bought it all. Many companies would have very much liked to order H200s and weren't able to, as they were all pre-sold to hyperscalers. If one of them stopped buying, it's very likely Nvidia could sell to other customers, though there might be more administrative overhead?
Now there are some interesting questions about Nvidia creating demand by investing huge amounts of money in cloud providers that will order Nvidia hardware, but that's a different issue.
It's probably not very likely that if a large buyer pulled out, NVIDIA could just sell to other customers. If a large buyer pulls out, that's a massive signal to everyone else to begin cutting costs as well. The large buyer either knows something everyone else doesn't, or knows something that everyone else has already figured out. Either way, the large buyer pulling out signals "I don't think the overall market is large enough to support this amount of compute at these prices at current interest rates", and everybody else is doing the same math too.
None of those customers can afford to cancel their orders. OpenAI, Google, and Meta cannot afford to get cheap on GPUs when presumably they believe AGI is around the corner. The first company to achieve AGI will win, because at that point all gains will become exponential.
All the AI companies are locked in a death loop where they must spend as much money as possible, otherwise everything they have invested will immediately become worthless. No one is going to pay for an LLM when a competitor has AGI. So it's a death loop for everyone that has become involved in this race.
I don't know why you are being downvoted. What you said makes sense to me but I understand I know very little about how companies think. Can someone with a differing point of view elaborate?
No idea why the downvotes, these are valid points. I still don’t fully agree with it:
1. There are alternatives to Nvidia: these 3 companies are probably developing their own alternatives, and at some point they will switch to their own solutions or to competitors (for example, Google used TPUs to train Gemini 3 [1], with no Nvidia GPUs, despite being a pretty large Nvidia customer).
2. The market seems to be consolidating: for example, Apple has decided to use Google Gemini for their new Siri [2]. I’m not an expert (or fortune teller), but I think it increases the chance that other companies might follow and drop out of the AI race.
3. I am sure that OpenAI and related companies would want to sustain these kinds of orders, but I am not sure it is possible without more and more funding, and I don’t know if even Sam himself can estimate how many GPUs they will be able to buy from Nvidia in 2026.
Kinkora implies that it has something to do with kinks, at least that was my first impression.
My guess is that it means something in another language, but it is probably not the first association you would want for an AI image generation product that can be used in a professional setting.
We’ve noticed that the name creates unintended associations for some users, especially in English, and that’s not what we want to emphasize going forward.
We’re actively discussing a rebrand to better reflect the creative and model-focused direction of the product.
I think that reading all of the information from the SSD should “recharge” it in most cases. The SSD controller should detect any bit flips and be able to correct them.
However, this is an implementation detail in the SSD FW. For Linux UBI devices, this will suffice.
It will trigger reads in random areas of flash and try to correct any errors found.
Without it, the same issue as in the original article will happen (even if the device is powered on): areas of the NAND that were not read for a long time will accumulate more and more errors, eventually becoming unrecoverable.
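For the “read everything back” idea on a regular SSD, a minimal sketch might look like the following. The device path is a placeholder, root access is assumed, and whether the controller actually rewrites weak pages after spotting correctable bit flips is up to its FW, as noted above:

```python
# Minimal sketch: sequentially read every block of a drive so the SSD controller
# runs ECC on each page; per the comment above, its FW may then refresh decaying data.
# Roughly equivalent to: dd if=/dev/sda of=/dev/null bs=1M
import sys

DEVICE = "/dev/sda"        # hypothetical device node; adjust for your system
CHUNK = 1024 * 1024        # read in 1 MiB chunks

def read_entire_device(path: str) -> int:
    """Sequentially read the whole device, discarding the data."""
    total = 0
    with open(path, "rb", buffering=0) as dev:
        while True:
            data = dev.read(CHUNK)
            if not data:   # empty read marks the end of the device
                break
            total += len(data)
    return total

if __name__ == "__main__":
    n = read_entire_device(DEVICE)
    print(f"read {n} bytes from {DEVICE}", file=sys.stderr)
```

This is not a substitute for the FW’s own scrubbing, or for the UBI mechanism mentioned above on raw NAND; it only gives the controller a chance to see every page.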
Hopefully it will make Qualcomm behave more like Arduino and not the opposite.
Qualcomm is one of the worst companies I have had the pleasure to work with.
Their support model is hellish and they provide very little information and documentation, so usually you’ll end up doing a lot of guessing and reverse engineering. They will tell you to sign a contract with one of their “design partners”, but even those partners can’t get answers to basic questions.
Seriously, if they want more small companies working with them they have to treat them better. I worked with them as a small company and as a larger company, and in both cases their support was basically nonexistent, even though we were buying more than $10M worth of chips from them a year.
Qcom is a corporate behemoth, much like Oracle. In the immortal words of Bryan Cantrill, it is a lawnmower and if you stick your hand in it you'll get it chopped off.
Tie that chip to a beamformer (Silicon Labs has a few) and you have a phased-array radar, which is a radar that does not move at all (pretty cool in my opinion; see the sketch of the steering math below).
Also, $15 is not cheap for this kind of chip. You can buy a full WiFi 7 RF/modem or a 4-core ARM64 SoC for this kind of money.
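For intuition about how a phased array steers without moving: the beamformer just applies a progressive phase offset to each element so their emissions add up coherently in the chosen direction. A toy sketch with illustrative numbers (60 GHz carrier, half-wavelength element spacing; nothing here comes from this particular chip’s datasheet):

```python
# Toy phased-array steering math for a uniform linear array: compute the
# per-element phase a beamformer would apply to point the beam at STEER_DEG
# off broadside, with no mechanical movement. Numbers are illustrative only.
import math

C = 3e8                      # speed of light, m/s
FREQ = 60e9                  # 60 GHz carrier
WAVELENGTH = C / FREQ        # ~5 mm
SPACING = WAVELENGTH / 2     # classic half-wavelength element spacing
N_ELEMENTS = 8
STEER_DEG = 30               # steer the beam 30 degrees off broadside

theta = math.radians(STEER_DEG)
for n in range(N_ELEMENTS):
    # Progressive phase so each element's signal arrives in phase at angle theta.
    phase = -2 * math.pi * n * SPACING * math.sin(theta) / WAVELENGTH
    print(f"element {n}: phase {math.degrees(phase) % 360:7.1f} deg")
```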
You can't use an external beamformer with this chip; it has the antenna built into the package itself. The chip doesn't have pins for RF input/output to bypass the built-in antenna.
60 GHz radar is very different from WiFi. $15 actually seems about right for the functionality this chip offers.
While data is a major point here, in my opinion these are the reasons developers prefer apps:
1. Persistence: while websites are very easy to close, deleting an app is much more difficult and usually requires pressing some “red buttons” and clicking through scary dialogs. It also means the user now has a button for your app on their Home Screen, which makes it a lot more accessible.
2. Notifications: while they exist for websites too, they are much less popular there and are turned off by default. Notifications are maybe the best way to get the user back into your app.
And while I hate the dark patterns some companies use (Meta, AliExpress, etc.), I do understand why getting the app installed is worth so much to them.
And why does a developer care about those things if not for the fact it means they can collect data even when the user isn’t actively using the service?
> And why does a developer care about those things...
I have several apps on my phone where I am interested in receiving notifications.
1. Airline app. While traveling I need to know about gate changes, flight time changes, etc. etc.
2. Credit card app. I have turned on notifications for all charges above $10.
3. Bank app. I have turned on notifications for all transfers.
4. Moen water meter app. If there is a water leak at my house, I need to know.
5. Server monitor app. If my website goes down, I need to know right away.
6. Google smoke detector. If there is smoke in my house, I need to know right away.
7. Tesla app. If I didn't close the door properly and walked away, the app lets me know.
8. Security camera app. If there is unexpected movement at my home or office, I get an alert.
9. WhatsApp and other messaging apps. When someone sends me a message, I get an alert.
And those are only the things that immediately come to mind. If you were a developer of some of these apps, would you be able to provide these same functions in a user-friendly way with a web app? Genuinely curious.
I actually do not want your garbage persisting on my machine, and if you want to notify me, you can ask for my email and maintain the required infrastructure to send me notification emails.
[1] https://discuss.privacyguides.net/t/updated-cellebrite-iphon... : support matrix from 2024, in many cases only AFU (after first unlock) is supported.