Not that I agree with the pardons, but former presidents are usually old. Letting your political opponent die in prison can have a massive backlash so most presidents would rather not let that happen.
I'm half joking, but if this AI boom continues we're going to see Nvidia exit the consumer GPU business. But Jensen Huang will never do that to us... (I hope)
On the contrary: this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. A driver crash? The gamer will just restart their game; an AI workload will stall and cost a lot of money.
Basically, the gaming segment is the beta-test ground for the datacenter segment, and you have beta testers eager to pay high prices!
We see the same in CPUs, by the way, where the datacenter lineups of both Intel and AMD lag behind the consumer lineup. That gives time to iron out BIOS, microcode, and optimizations.
Why would anyone sell a handful of GPUs to nobodies like us when they could sell a million GPUs for thousands apiece to a handful of big companies? We're speedrunning the absolute worst corpo cyberpunk timeline.
Might almost be a good thing, if it means abandoning overhyped/underperforming high-end game rendering tech, and taking things in a different direction.
The push for 4K with raytracing hasn't been a good thing, as it's pushed hardware costs way up and led to the attempts to fake it with AI upscaling and 'fake frames'. And even before that, the increased reliance on temporal antialiasing was becoming problematic.
The last decade or so of hardware/tech advances haven't really improved the games.
DLSS Transformer models are pretty good. Framegen can be useful but has niche applications due to the latency increase and artifacts. Global illumination can be amazing but is also pretty niche, as it's very expensive and comes with artifacts.
The biggest flop is UE5 and its Lumen/Nanite. Really, everything would be fine if not for that crap.
And yeah, our hardware is not capable of proper raytracing at the moment.
The push for ray tracing comes from the fact that they've reached the practical limits of scaling more conventional rendering. RT performance is where we are seeing the most gen-on-gen performance improvement, across GPU vendors.
Poor RT performance is more a developer skill issue than a problem with the tech. We've had games like Doom: The Dark Ages that flat out require RT, but the RT lighting pass only accounts for ~13% of frame time while producing much better results than any raster GI solution would achieve with the same budget.
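For a rough sense of what ~13% of frame time means in milliseconds, here's a back-of-envelope sketch (the frame rates are illustrative examples, not figures from the game):

```python
# Back-of-envelope: cost of an RT pass that takes ~13% of frame time.
def rt_pass_ms(fps: float, rt_fraction: float = 0.13) -> float:
    frame_ms = 1000.0 / fps  # total frame budget in milliseconds
    return frame_ms * rt_fraction

for fps in (30, 60, 120):
    print(f"{fps} fps: frame budget {1000/fps:.1f} ms, RT pass ~{rt_pass_ms(fps):.2f} ms")
```

At 60 fps that's only about 2 ms out of a ~16.7 ms frame, which is why a well-engineered RT pass can beat raster GI at the same cost.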
The literal multi-million dollar question that executives have never bothered asking: When is it enough?
Do I, as a player, appreciate the extra visual detail in new games? Sure, most of the time.
But if you asked me what I enjoy playing more, 80% of the time I'd pull out a list of 10+ year old titles that I keep coming back to, plus more that I would rather play than what's on the market today, if only they had an active playerbase (for multiplayer titles).
Honestly, I know I'm not alone in saying this: I'd rather we had more games focused on good mechanics and story, instead of visually impressive works that pile on MTX to recoup insane production costs. Maybe this is just the catalyst we need to get studios to redirect budgets to making games fun instead of spending a bunch of budget on visual quality.
Well, in the case of Doom: The Dark Ages, it's not just about fidelity but about scale and production. To make TDA's levels with the baked GI used in the previous game would have taken their artists considerably more time and resulted in 2-3x growth in install size, all while providing lighting that is less dynamic. The only benefit would have been the ability to support a handful of GPUs slightly older than the listed minimum spec.
Ray tracing has real implications not just for the production pipeline, but the kind of environments designers can make for their games. You really only notice the benefits in games that are built from the ground up for it though. So far, most games with ray tracing have just tacked it on top of a game built for raster lighting, which means they are still built around those limitations.
I'm not even talking about RT, specifically, but overall production quality. Increased texture detail, higher-poly models, more shader effects, general environmental detail, the list goes on.
These massive production budgets for huge, visually detailed games, are causing publishers to take fewer creative risks, and when products inevitably fail in the market the studios get shuttered. I'd much rather go back to smaller teams, and more reasonable production values from 10+ years ago than keep getting the drivel we have, and that's without even factoring in how expensive current hardware is.
I can definitely agree with that. AAA game production has become bloated with out of control budgets and protracted development cycles, a lot of that due to needing to fill massive overbuilt game worlds with an endless supply of unique high quality assets.
Ray tracing is a hardware feature that can help cut down on a chunk of that bloat, but only when developers can rely on it as a baseline.
They are already making moves that might suggest that future. They are going to stop packaging VRAM with their GPUs shipped to third-party graphics card makers, who will have to source their own, probably at higher cost.
They will constrain supply before exiting. Exiting outright just isn't smart: they can simply stop developing and let supply trickle out, which also works as insurance in case AI flops.
Honestly, I'd prefer it. It might get AMD and Intel off their asses on GPU development. I stopped buying Nvidia GPUs ages ago, before they saw value in the Linux/Unix market, and I'm tired of them sucking up all the air in the room.
They did get burned when crypto switched to dedicated hardware and Nvidia was left with (for them) huge surpluses of 10xx-series hardware. But what they're selling to AI companies now is far more differentiated from their consumer gear.
The brand-aware "consumers" are really just DIY PC builders, a relatively small number of people. The enterprise DRAM business is doing so great that Micron just doesn't see the consumer market as worth chasing.
This is bad for consumers though since DRAM prices are skyrocketing and now we have one less company making consumer DRAM.
The people in B2B RAM-buying jobs are not aliens from another planet. Brand awareness in consumer markets, especially ones so closely tied to people's jobs (nerds gonna nerd), is going to have a knock-on effect. It's not like a clothing brand or something.
Sometimes reputation and suchlike in the consumer market can directly boost your B2B business. Consumers and professionals alike will look at backblaze drive reliability figures.
Other times professionals will sneer at a consumer product, or a consumer product can diminish your brand. Nobody's wiring a data centre with Monster Cables, and nobody's buying Cisco because they were impressed by Linksys.
Yes, but the consumer brand has to have a good reputation for that to pan out positively in B2B. Crucial has a decent reputation, but the problem is that there hasn't been any innovation in the consumer DRAM market for two decades that wasn't driven by, or copied from, the enterprise sector. The difference between a Crucial DIMM and a Micron unbuffered DIMM is which brand's sticker they put on it, and maybe a heatsink and tighter binning/QA. That's not unique to Micron/Crucial.

Aside from "moar RGB", what innovation has happened on the consumer side of this space that isn't just a mirror of the enterprise side (e.g. DDR4 to DDR5)? EXPO/XMP? That's AMD/Intel dictating things to the DRAM companies. So what impression really are people meant to carry over from Crucial to Micron in this instance? How is Micron meant to leverage the Crucial brand in this space to stand out above others?
Similar story on the SSD side regarding reputation/innovation, especially when you consider that Crucial SSDs are no more "Micron" in a hardware sense than a Corsair drive built using Micron flash (support is a different matter): the controllers were contracted out to third parties (Phison), and the flash used was entry-level/previous-gen surplus compared to what goes into enterprise parts. The demands and use cases of consumers, and even prosumers/enthusiasts, are very different from, and in general substantially lighter than, those on the enterprise side of SSDs, and that gulf is only growing wider. So again, what is meant to carry over? How can Micron leverage Crucial to stand out when the consumer market just doesn't have the demands to support strong investment in standing out?
Frankly, taking what you say further: if this is what they want to do (have consumer brand recognition that carries over in some meaningful way to B2B), then sundowning Crucial now (given the current supply issues) and eventually re-entering the market as Micron when things return to some sense of "normal", so that the consumer and enterprise brands are the same brand, makes much more sense.
The values posted above cannot be far from the real values.
Perhaps they use something like 700 A at 1400 V.
It is unlikely that they use a voltage over 1500 V, because the semiconductor switches used in the converter become much more expensive at higher voltages. A current of 700 A or even 1000 A can be easily handled by a single IGBT module. There are much bigger semiconductor switches that can handle several thousand ampere currents.
I should have been clearer, I did a web search and found other articles, which based on marketing information from the company, say it's 1000A and 1000V.
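Either way, both sets of figures land in the same ballpark once you multiply them out (plain P = V × I, ignoring converter losses):

```python
# Implied power for the two figures discussed above (P = V * I).
def power_kw(volts: float, amps: float) -> float:
    return volts * amps / 1000.0  # watts -> kilowatts

print(power_kw(1400, 700))   # the 700 A at 1400 V guess: 980 kW
print(power_kw(1000, 1000))  # the marketing figure: 1000 kW, i.e. 1 MW
```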
It was definitely not perfectly clear. I was very confused when I first read the comment.
If we were talking about laptops I would know M3 refers to the processor, but there's literally a car named "M3" that's been popular for nearly 40 years.
Yes and no, going from 47 to 50 would buy a few of the most popular meme stocks so there simply aren't enough people to make it a true meme stock with that market cap.
I sometimes wonder how many talented engineers top colleges are rejecting because they were busy working on real engineering projects like this rather than on academics and test scores.
Unless they are forced to learn things that are uninteresting to them.
I almost failed the high school entry exams because I dedicated more time to soldering electronic devices and programming computers than to writing essays about Polish literature or memorizing dates of historical battles. Same thing with the final high school exams - it was a really close call. I felt like they gave me good scores on non-STEM subjects just because I had already won some prizes in electronics/physics olympiads and brought some fame to the school, so I kinda got away with it, but... it was stressful anyways.
Man, you just triggered me. This was also me in school.
I even have a huge interest in history, but I remember my first history exam on World War 1. I was ready to answer questions on its causes, the people, how industrial war changed the nature of fighting, the new countries that formed after the war... First question: What was the date the Serbian nationalist Gavrilo Princip assassinated Archduke Franz Ferdinand? Second question: What were the dates each country declared war?...
It also took me years to actually sit down and read JRR Tolkien, because we read The Hobbit as a class book in grade 8. First question on the test: List the names of the 13 dwarves that attended the party at Bilbo's house (1 point each, on a test out of 30 IIRC).
Holy crap - I have read The Hobbit many times (and LotR a few less) and I would never have taken the time to commit 13 character names to memory - most of them simply were not that memorable.
The rhyming sets are a bit of a crutch, at least (Fili + Kili, Óin + Glóin, Bifur + Bofur + Bombur, etc). But you're right - most of the dwarves are individually forgettable. Only two are substantially characterized - Thorin the leader, and Bombur the comically fat.
Heh - exactly - the only truly easy character name that has always stuck in my mind was from "Snow Crash", I mean - who can forget "Hiro Protagonist"...?
But doesn't your global liberal democratic society want you to be trained and proficient in productive tasks which might only appear boring through a superficial lens or perspective? Or is this world's education entirely motivated by the selfish desire for pleasure and leisure? Rather than being founded to serve hard work ethic principles and effective programs that maybe can help build decent societies?
It's true, and okay, that the academia grind is only for a subset of us. It is not the only meaningful path! I went on to grad school by rote, and I do not push it on my high-school students or anyone else. It took me about 40 years to find a sense of purpose (having a child was the catalyst).
Sadly, the push for STEM seems motivated by capitalists wanting further control of valuable labor, so I'm really chuffed by Bryan's Show HN post. Even though open-source can be leveraged by capital, it doesn't have to be. It is a non-walled-garden model, and an example of what we can do collectively. Even if the Linux kernel is largely funded by corporations, it doesn't have to be.
A concern is that a laptop is still not something my community can make with the local resources, and thus the exploitation of land, labor, and money continues.
> Unless they are forced to learn things that are uninteresting to them.
This really resonates with me. I love math now, but absolutely loathed it in high school. The curriculum lacked any sort of way to apply math to real problems. I simply cannot learn things in the abstract like that. It's like learning a programming language without ever building a program.
Same. I stopped "caring" about math when we started to learn polynomials. Binomials... ok. Trinomials... ok. But then it just became repetitive, with the class just adding more terms to the functions, and over the semester I ended up spending most of the class daydreaming.
I disagree; I did projects similar to this in high school (not exactly like this; his is a true achievement). I did very well grade-wise and had a high GPA, but I bombed the SAT because I didn't understand that skipped questions didn't cost you the same number of points as wrong answers. So on the ones I didn't have time to answer, I just selected randomly, which resulted in a poor score.
I found out later:
1. How SAT scoring works
2. That you shouldn’t take the last SAT of the year since then you cannot retake it
3. I probably should’ve taken the ACT instead
I wish they’d prepared us in school for this, but they were too busy training us for standardized state testing since that determined their own budget.
Could I have gotten into MIT? Unsure; back at 18 I didn't know MIT existed, and this was the early Internet era. It would have been nice if my high school had mentioned it as an option.
In my case at least, doing projects like this and getting good grades didn’t automatically turn into attending any college I wanted. Either way, I ended up with a great career.
Anyways, kudos to the person who made this project!
Thankfully, the SAT no longer deducts points for wrong answers. But I agree, there's a big difference between testing and doing really great work.
I'm somewhat on the other end of this, where I excelled in school, graduated valedictorian, but didn't gain any meaningful experience with projects and such and had poor leadership skills all around.
I've known a few exemplars like this one, at least two. One built a 737 flight simulator in his backyard that airline pilots regularly used for training. The other made a complete discrete FM stereo transmitter and later set up his own radio station. He was 16, and it was the early '90s, so all from books.
Both guys brutally failed their first year at university. They did not like theory; they wanted to make things.
Unless you aren't fit for traditional academic learning models.
I spent most of my young adulthood working on projects (not nearly as insanely technical as this! but) similar to this. But I dropped out of high school, didn't go to college, because none of them would teach me in a way, or a pace, that fit my learning disability or mental models. Luckily I had the drive to teach myself, and built a successful two-decade career, despite my parents and teachers telling me I'd fail and become homeless.
High school kids have insane potential, and can achieve truly amazing things. But often people disregard them and don't set them up for success. So many companies could hire really great engineers, even from high school, if they could just find the motivated ones and put them in a mentorship/apprenticeship program that aligned with their interests and ways of learning.
You really don’t want to see my pre-university grades.
I was on a mission, and I can’t do two things at once. So school was about efficiency. I got great grades wherever that took low effort. That only went so far.
After graduation, nowhere I wanted to be would have looked at me.
It took me a couple years after high school to find the right university, but my personal projects paid off.
Looking back, it was a gamble. But you don’t really choose those kinds of paths.
I dunno. As a kid I only succeeded academically because of my IQ, not because of grit learnt from my projects. I pathologically hated being told what to do, so the determination to do my own projects did not translate into anything assigned to me.
Going by how many gifted children end up underperforming because they are made to do stupid things and then get labeled as difficult or slow: a lot more than you'd think.
Being talented and gifted is generally not appreciated, not even in academia. Many of the most talented people never finish their education because academia is more about playing the game & having the grit (or lack of backbone?) to deal with the bullshit and do what you are told.
And tbf, the best engineers I know are not necessarily the most talented ones, but those that developed the grit to push through the bs.
Would you please stop taking threads on flamewar tangents? Your comments in this thread have been inflammatory and offtopic. That's not what this site is for, and destroys what it is for.
Well, sincerely, my objective here is to spark thought provoking discussion, which should lead to personal cognitive improvements. But I must admit that it's kinda not my fault that the format I chose is really provocative.
edit: And doesn't that defeat the purpose of a nerd-rich forum like Hacker News? The stifling of creativity and abstract thoughts? I came here to socialize and find bright minds, no offense.
I'm not worried about whose fault it is, I just need you to stop posting like this to Hacker News. As I said, it's not what this site is for, and destroys what it is for. And it's particularly dismaying to see in a thread such as this one.
Some of the side effects of semaglutide are just a result of eating fewer calories.
Without a control group that ate the same number of calories but without the drug, it's hard to know whether the side effects were directly caused by semaglutide or were just a result of being in a calorie deficit.