
Not that I agree with the pardons, but former presidents are usually old. Letting your political opponent die in prison can provoke a massive backlash, so most presidents would rather not let that happen.

I'm half joking, but if this AI boom continues we're going to see Nvidia exit the consumer GPU business. But Jensen Huang will never do that to us... (I hope)


There are a couple of reasons why Jensen won't take off the gaming leather jacket just yet:

1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

2. It's a market defense to keep other players down and keep them from growing their way into data centers.

3. It's profitable (probably the main reason, but boring).

4. Hedge against data center volatility (10 key customers vs millions)

5. Antitrust defense (which they used when they tried to buy ARM)


6. Techies who use Nvidia GPUs in their PCs are more likely to play with AI and ultimately contribute to the space as either developers or users.


7. Maybe just don't put all your eggs in one basket, especially when that basket is an industry that has yet to deliver on its promise.


They'll access GPUs through their company VPN

If they're unemployed, they'll just rent from the cloud

How many of you still manage your own home server?


> 1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

No way that is true any more. Five years ago, maybe.

https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...


on the contrary. this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. driver crash? the gamer will just restart their game. the AI workload will stall and cost a lot of money.

basically, the gaming segment is the beta-test ground for the datacenter segment. and you have beta testers eager to pay high prices!

we see the same in CPUs by the way, where the datacenter lineup of both intel and amd lags behind the consumer lineup. gives time to iron out bios, microcode and optimizations.


Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.


Because when you lose even one of those big companies in your handful, it tanks your business. Customer diversity is a good thing.

And they're not selling a handful of GPUs to nobodies like us; they're selling millions of GPUs to millions of nobodies.


Gaming is now less than 10% of Nvidia's revenue. We're really not adding any meaningful diversity to their bottom line anymore.


> Customer diversity is a good thing.

Tell that to Micron.


The way things are going no one will be able to afford a PC.

Instead we will be streaming games from our locked down tablets and paying a monthly subscription for the pleasure.


You will own nothing and be happy.


Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.

The push for 4K with raytracing hasn't been a good thing, as it's pushed hardware costs way up and led to the attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increased reliance on temporal antialiasing was becoming problematic.

The last decade or so of hardware/tech advances haven't really improved the games.


DLSS Transformer models are pretty good. Framegen can be useful but has niche applications due to the latency increase and artifacts. Global illumination can be amazing but is also pretty niche, as it's very expensive and comes with artifacts.

The biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.

And yeah, our hardware is not capable of proper raytracing at the moment.


> Framegen can be useful but has niche applications

Somebody should tell that to the AAA game developers that think hitting 60fps with framegen should be their main framerate target.


The latest DLSS and FSR are good actually. Maybe XeSS too.


The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.

Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom The Dark Ages that flat out require RT, but the RT lighting pass only accounts for ~13% of frame times while pushing much better results than any raster GI solution would do with the same budget.
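For scale, here's a rough back-of-the-envelope sketch of what that ~13% means (my own numbers, assuming a 60 fps target):

    # Frame-budget sketch: assumed 60 fps target, ~13% of frame time
    # spent on the RT lighting pass (the figure quoted above).
    fps = 60
    frame_time_ms = 1000 / fps          # ~16.7 ms per frame at 60 fps
    rt_pass_ms = 0.13 * frame_time_ms   # ~2.2 ms for the RT GI pass
    print(f"{rt_pass_ms:.1f} ms of a {frame_time_ms:.1f} ms frame")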


The literal multi-million dollar question that executives have never bothered asking: When is it enough?

Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.

But, if you asked me what I enjoy playing more 80% of the time? I'd pull out a list of 10+ year old titles that I keep coming back to, and more that I would rather play than what's on the market today if they only had an active playerbase (for multiplayer titles).

Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets to making games fun instead of spending a bunch of budget on visual quality.


Well, in the case of Doom: The Dark Ages, it's not just about fidelity but about scale and production. To make TDA's levels with the baked GI used in the previous game would have taken their artists considerably more time and resulted in a 2-3x growth in install size, all while providing lighting that is less dynamic. The only benefit would have been the ability to support a handful of GPUs slightly older than the listed minimum spec.

Ray tracing has real implications not just for the production pipeline, but the kind of environments designers can make for their games. You really only notice the benefits in games that are built from the ground up for it though. So far, most games with ray tracing have just tacked it on top of a game built for raster lighting, which means they are still built around those limitations.


I'm not even talking about RT, specifically, but overall production quality. Increased texture detail, higher-poly models, more shader effects, general environmental detail, the list goes on.

These massive production budgets for huge, visually detailed games, are causing publishers to take fewer creative risks, and when products inevitably fail in the market the studios get shuttered. I'd much rather go back to smaller teams, and more reasonable production values from 10+ years ago than keep getting the drivel we have, and that's without even factoring in how expensive current hardware is.


I can definitely agree with that. AAA game production has become bloated with out of control budgets and protracted development cycles, a lot of that due to needing to fill massive overbuilt game worlds with an endless supply of unique high quality assets.

Ray tracing is a hardware feature that can help cut down on a chunk of that bloat, but only when developers can rely on it as a baseline.


I think Nvidia realises that selling GPUs to individuals is useful as it allows them to develop locally with CUDA.


This is a huge reason.


They are already making moves that might suggest that future. They are going to stop packaging VRAM with their GPUs shipped to third-party graphics card makers, who will have to source their own, probably at higher cost.


They will constrain supply before exiting. An abrupt exit just isn't smart; they can stop developing and let sales trickle off, which also works as insurance in case AI flops.


In the words of Douglas Adams, there are those who say that this has already happened.


Honestly, I'd prefer it. It might get AMD and Intel more off their ass on GPU development. I already stopped buying Nvidia GPUs ages ago, before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.


Intel GPUs are probably not going to last much longer, considering they did a deal with Nvidia for integrated GPUs.


Jensen is too paranoid to do it. But whoever comes after him will do it ASAP.


They did get burned when crypto switched to dedicated hardware and Nvidia was left with (for them) huge surpluses of 10xx-series hardware. But what they're selling to AI companies now is much further removed from their consumer gear.


Keep the retail investors happy so they keep pumping your stock.


Wonder if Google will ever start selling TPUs.




I was thinking large ones, to other AI companies.


The brand-aware "consumers" are really just DIY PC builders, which is a relatively small group. The enterprise DRAM business is doing so well that Micron just doesn't see the consumer market as worth chasing.

This is bad for consumers, though, since DRAM prices are skyrocketing and now we have one less company making consumer DRAM.


The people in B2B RAM-buying jobs are not aliens from another planet. Brand awareness in consumer markets, especially ones so closely tied to people's jobs (nerds gonna nerd), is going to have a knock-on effect. It's not like a clothing brand or something.


Sometimes reputation and suchlike in the consumer market can directly boost your B2B business. Consumers and professionals alike will look at backblaze drive reliability figures.

Other times professionals will sneer at a consumer product, or a consumer product can diminish your brand. Nobody's wiring a data centre with Monster Cables, and nobody's buying Cisco because they were impressed by Linksys.


Not that it invalidates your point, but Cisco sold Linksys in 2013.


Yes, but the consumer brand has to have a good reputation for that to pan out positively in B2B. Crucial has a decent reputation, but the problem is that there hasn't been any innovation in the consumer DRAM market for two decades that wasn't driven by or copied from the enterprise sector. The difference between a Crucial DIMM and a Micron unbuffered DIMM is which brand's sticker they put on it, and maybe a heatsink and tighter binning/QA. That's not unique to Micron/Crucial.

Aside from "Moar RGB", what innovation has happened on the consumer side of this space that isn't just a mirror of the enterprise side (e.g. DDR4 to DDR5)? EXPO/XMP? That's AMD/Intel dictating things to DRAM companies. So what impression are people really meant to carry over from Crucial to Micron in this instance? How is Micron meant to leverage the Crucial brand in this space to stand out above others?

It's a similar story on the SSD side of things regarding reputation/innovation, especially when you consider that Crucial SSDs are no more "Micron" in a hardware sense than a Corsair drive built with Micron flash (support is a different matter): the controllers were contracted out to third parties (Phison), and the flash used was entry-level/previous-gen surplus compared to what goes into enterprise parts. The demands and use cases of consumers, and even prosumers/enthusiasts, are very different from and in general substantially lower than those on the enterprise side of SSDs, and that gulf is only growing wider. So again, what is meant to carry over? How can Micron leverage Crucial to stand out when the consumer market just doesn't have the demands to support strong investment in standing out?

Frankly, taking what you say further: if this is what they want to do (build consumer brand recognition that carries over in some meaningful way to B2B), then sunsetting Crucial now (given the current supply issues) and eventually re-entering the market as Micron when things return to some sense of "normal", so that the consumer and enterprise brands are one and the same, makes much more sense.


Well unless there's some ghost-like life form in a gas state, we sort of need the molecules to stay together to form life.


Obligatory KRAZAM video on microservices https://www.youtube.com/watch?v=y8OnoxKotPQ


Or 1300 horsepower


That's just a rough conversion for 1 megawatt. I was curious about the actual voltage and current they were using, since the article didn't say much.


The values posted above cannot be far from the real values.

Perhaps they use something like 700 A at 1400 V.

It is unlikely that they use a voltage over 1500 V, because the semiconductor switches used in the converter become much more expensive at higher voltages. A current of 700 A or even 1000 A can easily be handled by a single IGBT module, and there are much bigger semiconductor switches that can handle currents of several thousand amperes.


I should have been clearer: I did a web search and found other articles which, based on marketing information from the company, say it's 1000 A and 1000 V.

https://electrek.co/2025/03/17/byd-confirms-1000v-super-e-pl...
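A quick sanity check on those figures (my own arithmetic, not from the article): power is just voltage times current, so both guesses land right around 1 MW, and at roughly 1341 hp per megawatt that also matches the "1300 horsepower" figure upthread.

    # P = V * I, in megawatts; figures assumed from the thread above.
    def power_megawatts(volts, amps):
        return volts * amps / 1e6

    print(power_megawatts(1400, 700))    # 0.98 MW (the 700 A / 1400 V guess)
    print(power_megawatts(1000, 1000))   # 1.0 MW (electrek's 1000 V / 1000 A)
    print(1.0 * 1341)                    # ~1341 hp per MW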


Sir, M3 means something else in the car world


And yet, you perfectly understood what they meant, because words can have different meanings depending on the context.

Luckily, GP made it abundantly clear he wasn’t talking about a beemer.


It was definitely not perfectly clear. I was very confused when I first read the comment.

If we were talking about laptops I would know M3 refers to the processor, but there's literally a car named "M3" that's been popular for nearly 40 years.


> Luckily, GP made it abundantly clear he wasn’t talking about a beemer

He did not, nor did he make it clear - certainly not abundantly so - what they WERE talking about, which is the core and more important problem.

For example, just because it's abundantly clear they weren't talking about a Boeing 747 doesn't mean I have any idea what they were on about.


eee egree, duin ber minimoom to bee oonderstuut is olweis bete bikoz it seiv taim


Nvidia has a P/E of 47. While that may be a bit high for a semiconductor company, it's definitely not a meme-stock figure.


The forward P/E is half that, based on real numbers for future orders reported in company filings.
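To spell out the arithmetic with purely illustrative numbers (not actual Nvidia financials): the forward P/E divides the same share price by projected rather than trailing earnings, so a forward P/E of about half the trailing 47 implies the market expects earnings to roughly double.

    # Illustrative only -- hypothetical price, not real Nvidia figures.
    price = 100.0
    trailing_eps = price / 47        # implied by a trailing P/E of 47
    forward_eps = 2 * trailing_eps   # orders suggest earnings roughly double
    print(price / forward_eps)       # 23.5, about half the trailing P/E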


Yes and no: pushing the P/E from 47 to 50 would take enough money to buy a few of the most popular meme stocks outright, so there simply aren't enough people to make it a true meme stock at that market cap.


I sometimes wonder how many talented engineers top colleges are rejecting because they were busy working on real engineering projects like this rather than on academics and test scores.


Probably not a lot; kids who have the grit to work on projects like this are the ones most likely to succeed academically.


Unless they are forced to learn things that are uninteresting to them. I almost failed the high school entry exams because I dedicated more time to soldering electronic devices and programming computers than to writing essays about Polish literature or memorizing dates of historical battles. Same thing with the final high school exams; it was a really close call. I felt like they gave me good scores on non-STEM subjects just because I had already won some prizes in electronics/physics olympiads and brought some fame to the school, so I kinda got away with it, but... it was stressful anyway.


Man, you just triggered me. This was also me in school.

I even have a huge interest in history, but I remember my first history exam, on World War 1. I was ready to answer questions on its causes, the people, how industrial war changed the nature of fighting, the new countries that formed after the war... First question: What was the date the Serbian nationalist Gavrilo Princip assassinated Archduke Franz Ferdinand? Second question: What were the dates each country declared war...

It also took me years to actually sit down and read JRR Tolkien as we read the Hobbit as a class book in grade 8. First question for the test: List the names of the 13 dwarves that attended the party at Bilbo's house (1 point each for a test out of 30 IIRC).


Holy crap - I have read the Hobbit many times (and LoTR a few less) and I would never have taken the time to commit the names of 13 characters to memory - most of them simply were not that memorable.


The rhyming sets are a bit of a crutch, at least (Fili + Kili, Óin + Glóin, Bifur + Bofur + Bombur, etc). But you're right - most of the dwarves are individually forgettable. Only two are substantially characterized - Thorin the leader, and Bombur the comically fat.


Heh - exactly - the only truly easy character name that has always stuck in my mind was from "Snow Crash", I mean - who can forget "Hiro Protagonist"...?


[flagged]


Discipline and obligation refer to things that materially matter to the people around us, and to society - not rote memorization of pointless facts.

And intelligence is just as much about identifying and applying effort towards useful goals - your "if you're so smart" is anything but.


But doesn't your global liberal democratic society want you to be trained and proficient in productive tasks which might only appear boring through a superficial lens or perspective? Or is this world's education entirely motivated by the selfish desire for pleasure and leisure? Rather than being founded to serve hard work ethic principles and effective programs that maybe can help build decent societies?


I think that's a poor way of framing it.

If the work is genuinely worthwhile, and the people who do it are respected, there will be people to do it.

Teaching people to suffer through work without any apparent reason - that's something capitalist society wants, not liberal democracy.


[flagged]



You hit it right on the head, I think.

Even at my own university, I struggle to maintain a 3.0 GPA while at the same time actively tutoring students for the very courses I'm failing.

The issue isn't knowledge or competency, it's a mix of work ethic and tolerance for menial busywork.

I think some of us just aren't made for the academia grind...


It's true, and okay, that the academia grind is only for a subset of us. It is not the only meaningful path! I went on to grad school by rote, and I do not push it on my high-school students or anyone else. It took me about 40 years to find a sense of purpose (having a child was the catalyst). Sadly, the push for STEM seems motivated by capitalists wanting further control of valuable labor, so I'm really chuffed by Bryan's Show HN post - even though open source can be leveraged by capital, it doesn't have to be. It is a non-walled-garden model, and an example of what we can do collectively. Even if the Linux kernel is largely funded by corporations, it doesn't have to be.

A concern is that a laptop is still not something my community can make with the local resources, and thus the exploitation of land, labor, and money continues.

What would a fair-trade laptop cost?


> Unless they are forced to learn things that are uninteresting to them.

This really resonates with me. I love math now, but absolutely loathed it in high school. The curriculum lacked any sort of way to apply math to real problems. I simply cannot learn things in the abstract like that. It's like learning a programming language without ever building a program.


Same. I stopped "caring" about math when we started to learn polynomials. Binomials... ok. Trinomials... ok. But then it just became repetitive - the class was just adding more terms to the functions - and over the semester I ended up spending most of the class daydreaming.


I disagree; I did similar projects in high school (not exactly like this one; his is a true achievement). I did very well grade-wise and had a high GPA, but I bombed the SAT because I didn't understand that skipped questions and wrong answers weren't penalized the same. So the ones I didn't have time to answer I just guessed on randomly, which resulted in a poor score.

I found out later:

1. How SAT scoring works

2. That you shouldn't take the last SAT of the year, since then you cannot retake it

3. I probably should've taken the ACT instead
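On point 1, for anyone curious: the old SAT deducted a quarter point per wrong answer and nothing for a skip, so on a five-choice question a blind guess averaged out to zero - no better than skipping, and worse if you were unlucky. A quick expected-value sketch:

    # Expected value of a blind guess under the old SAT penalty
    # (assumes 5 answer choices, -1/4 point per wrong answer).
    p_right = 1 / 5
    ev = p_right * 1.0 + (1 - p_right) * (-0.25)
    print(ev)   # 0.0 -- guessing gains nothing over skipping, on average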

I wish they’d prepared us in school for this, but they were too busy training us for standardized state testing since that determined their own budget.

Could I have gotten into MIT? Unsure; back at 18 I didn’t know MIT existed and this was early Internet times. It would have been nice if my high school mentioned it as an option.

In my case at least, doing projects like this and getting good grades didn’t automatically turn into attending any college I wanted. Either way, I ended up with a great career.

Anyways, kudos to the person who made this project!


Thankfully, the SAT no longer deducts points for wrong answers. But I agree, there's a big difference between testing and doing really great work.

I'm somewhat on the other end of this, where I excelled in school, graduated valedictorian, but didn't gain any meaningful experience with projects and such and had poor leadership skills all around.


I've known a few exemplars like this one - at least two. One made a 737 flight simulator in his backyard that airline pilots used regularly to train. The other made a complete discrete FM stereo transmitter and later set up his own radio station. He was 16, and it was the early 90s, so it was all from books.

Both guys failed brutally in their first year at university. They did not like theory; they wanted to make things.

So… I dunno. Two reference points there.


Unless you aren't fit for traditional academic learning models.

I spent most of my young adulthood working on projects similar to this (not nearly as insanely technical, but similar). But I dropped out of high school and didn't go to college, because none of them would teach me in a way, or at a pace, that fit my learning disability or mental models. Luckily I had the drive to teach myself, and built a successful two-decade career, despite my parents and teachers telling me I'd fail and become homeless.

High school kids have insane potential, and can achieve truly amazing things. But often people disregard them and don't set them up for success. So many companies could hire really great engineers, even from high school, if they could just find the motivated ones and put them in a mentorship/apprenticeship program that aligned with their interests and ways of learning.


You really don’t want to see my pre-university grades.

I was on a mission, and I can’t do two things at once. So school was about efficiency. I got great grades wherever that took low effort. That only went so far.

After graduation, nowhere I wanted to be would have looked at me.

It took me a couple years after high school to find the right university, but my personal projects paid off.

Looking back, it was a gamble. But you don’t really choose those kinds of paths.


I dunno. I only succeeded academically as a kid because of my raw IQ, not because of grit learned from my projects. I pathologically hated being told what to do, so the determination to do my own projects did not translate into anything assigned to me.


Going by how many gifted children end up underperforming because they are made to do stupid things and then get labeled as difficult or slow: a lot more than you'd think.

Being talented and gifted is generally not appreciated, not even in academia. Many of the most talented people never finish their education because academia is more about playing the game & having the grit (or lack of backbone?) to deal with the bullshit and do what you are told.

And tbf, the best engineers I know are not necessarily the most talented ones, but those that developed the grit to push through the bs.


[flagged]


Would you please stop taking threads on flamewar tangents? Your comments in this thread have been inflammatory and offtopic. That's not what this site is for, and destroys what it is for.

If you wouldn't mind reviewing https://news.ycombinator.com/newsguidelines.html and taking the intended spirit of the site more to heart, we'd be grateful.


Well, sincerely, my objective here is to spark thought provoking discussion, which should lead to personal cognitive improvements. But I must admit that it's kinda not my fault that the format I chose is really provocative.

edit: And doesn't that defeat the purpose of a nerd-rich forum like Hacker News? The stifling of creativity and abstract thoughts? I came here to socialize and find bright minds, no offense.


I'm not worried about whose fault it is, I just need you to stop posting like this to Hacker News. As I said, it's not what this site is for, and destroys what it is for. And it's particularly dismaying to see in a thread such as this one.


Some of the side effects of semaglutide are just a result of eating fewer calories.

Without a control group that ate the same number of calories but without the drug, it's hard to know whether the side effects were directly caused by semaglutide or were just a result of being in a calorie deficit.


well, it does lead to less eating, so it is indeed a side effect. if the control group ate the same amount, there would be no weight loss to begin with.

