Why do you need one of those as a gamer? A 1080 Ti pushed 120+ fps in heavy, realistic-looking games. Turning on RT on the 20xx series slashed that back to 15 fps, but is RT really necessary to play games? Who cares about real-world reflections? And reviews showed that RT+DLSS sometimes introduced so many artefacts that the realism argument seemed absurd.
Any modern card under $1000 is more than enough for graphics in virtually all games. The gaming crisis is not in the graphics card market at all.
A bunch of new games are RT-only. Nvidia has aggressively marketed the idea that RT, FG, and DLSS are "must-haves" in game engines and that "raster is the past". Resolution is also a big jump: 4K 120Hz in HDR is rapidly becoming common, and the displays are almost affordable (especially for TV-based gaming).
In fact, as of today, even the very fastest RTX 4090 cannot run CP2077 at max non-RT settings in 4K at 120fps.
Now, I do agree that $1000 is plenty for 95% of gamers, but for those who want the best, Nvidia is pretty clearly holding back intentionally. The gap between a 4080 Ti and a 4090 is GIANT. Check this great comparison from Tom's Hardware: https://cdn.mos.cms.futurecdn.net/BAGV2GBMHHE4gkb7ZzTxwK-120...
The biggest leap to the next offering on the chart is the 4090.
I'm an ex-gamer, a pretty recent ex-, and I currently own a 4070 Ti (just to show I'm not a grumpy GTX guy). Max settings are nonsensical. You never want to spend 50% of your frame budget on ASDFAA x64. Lowering AA alone to barely noticeable levels makes a game run 30-50% faster*. Anyone choosing a graphics card can watch benchmarks and basically multiply the FPS by 1.5-2, because that's what playable settings will give them. And 4K is really a matter of taste, especially in the "TV" segment, where it's a snake-oil resolution more than anything else.
* You also want to ensure your CPU doesn't C1E-power-cycle every frame and that your frametimes don't look like an EKG. There's much more to performance tuning than buying a $$$$$ card; it's like installing a V12 engine in a rusted Fiat. If you want performance, you want RTSS, AB, driver settings, BIOS settings, and then a 4090.
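To put numbers on the frame-budget point, here's a quick back-of-the-envelope sketch in Python; the 50% AA share and the 60% recovered fraction are illustrative assumptions from the claims above, not measurements:

    # Rough frame-budget arithmetic; all shares are illustrative assumptions.
    target_hz = 120
    frame_ms = 1000 / target_hz          # ~8.33 ms budget per frame

    aa_ms = frame_ms * 0.50              # heavy AA eating half the budget
    recovered_ms = aa_ms * 0.6           # assume lighter AA gives back ~60%

    new_frame_ms = frame_ms - recovered_ms
    print(f"max settings:   {1000 / frame_ms:.0f} fps")
    print(f"AA dialed down: {1000 / new_frame_ms:.0f} fps")   # ~43% faster here

That's how "lower one setting, get 30-50% more fps" falls out of the math: you're not making the GPU faster, you're just not spending the budget.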
Many people are running at 4K now, and a 4080 struggles to break 100 frames in many current games maxed out (never mind future titles). So there's plenty of market among gamers (myself included) looking at the 50-series for closer-to-4090 performance at a non-obscene price.
This is just absolutely false: Steam says that 4.21% of users play at 4K, and only 10.61% play above 1440p. So you are wrong, simply wrong.
This is a chicken-and-egg thing, though: people don't play at 4K because it requires spending a lot of $$$ on a top-of-the-line GPU, not because they don't want to.
Did I say all the people, or did I say many people?..
Why are you so hostile? I'm not justifying the cost; I'm simply in the 4K market and replying to OP's statement "Any modern card under $1000 is more than enough for graphics in virtually all games", which is objectively false if you're a 4K user.
1. Because you shoot at puddles?
2. Because you play at night after a rainstorm?
Really, these are the only two situations where ray tracing makes much of a difference. We already have simulated shadowing in many games, and it works pretty well, actually.
Yes, actually. A lot of games use water, a lot, in their scenes (70% of the planet is covered in it, after all), and that does improve immersion and feels nice to look at.
Silent Hill 2 Remake and Black Myth: Wukong both have a meaningful amount of water in them and are improved visually with raytracing for those exact reasons.
Can you please point to the mentioned effects here? Immersion in what? It looks like PS4-gen Tomb Raider to me, honestly. All these water reflections existed long before RTX; it didn't introduce reflective surfaces. What it did introduce is dynamic reflections/ambience, which is a very specific thing to look for in the videos above.
> does improve immersion and feels nice to look at
I bet that this is purely synthetic, because RTX gets pushed down players' throats by not implementing any RTX-off graphics at all.
> by not implementing any RTX-off graphics at all.
Just taking this one: you actually make the point about having a raytracing-ready graphics card for me. If all the games are now doing the hard and mathematically taxing reflection and light-bouncing work through raytracing, without even an option for non-raytraced rendering, then raytracing is where we're going, and having a good RT card is, now or very soon, a requirement.
It's not me making this point, but Nvidia's green-paper agreements with particular studios to milk you for more money for basically the same graphics we had in TR:ROTT. If you're fine with that, godspeed. But "we" are not going RT "now". Most of Steam plays on xx60 cards and their equivalents, which cannot reasonably run RT-only titles, so there's no natural incentive to go there.
Indeed you're not a gamer, but you're the target audience for gaming advertisements and $2000 GPUs.
I still play traditional roguelikes from the 80s (and their modern counterparts) and I'm a passionate gamer. I don't need a fancy GPU to enjoy the masterpieces, because at the end of the day nowhere in the definition of "game" is there a requirement for realistic graphics -- and what passes as realistic changes from decade to decade anyway. A game is about gameplay, and you can have great gameplay with barely any graphics at all.
I'd leave raytracing to those who like messing with GLSL on Shadertoy; meanwhile, people like me have zero options if we want a good budget card that just has good raster performance and no AI/RTX bullshit.
And ON TOP OF THAT, every game engine has turned to utter shit in the last 5-10 years. Awful performance, awful graphics, forced sub-100% resolution... And to get anything that doesn't look like shit and runs at a passable framerate, you need to enable DLSS. Great.
> Any modern card under $1000 is more than enough for graphics in virtually all games
I disagree. I run a 4070 Super and a Ryzen 7700 with DDR5, and I still can't run Assetto Corsa Competizione in VR at 90fps. MSFS 2024 runs at thirty-something fps at medium settings. VR gaming is a different beast.
Spending $2 quadrillion on a GPU won't fix poor raster performance, which is what you need when you're rendering two frames side by side. Transistors only get so small before AI slop gets sold as an improvement.
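For a sense of scale, here's the raw pixel-throughput arithmetic; the per-eye resolution and render-scale are assumptions (roughly Quest 3-class numbers), not specs for any particular headset:

    # Pixels per second: flat 4K@120 vs. stereo VR at 90 Hz.
    flat_4k = 3840 * 2160 * 120              # ~1.0 Gpx/s

    per_eye = 2064 * 2208                    # assumed per-eye panel resolution
    render_scale = 1.3                       # extra pixels for lens distortion
    vr = per_eye * 2 * 90 * render_scale     # two eyes, 90 Hz

    print(f"4K@120: {flat_4k / 1e9:.2f} Gpx/s")
    print(f"VR@90:  {vr / 1e9:.2f} Gpx/s")   # comparable raster load

So a 90fps VR target is already in 4K@120 territory on pure rasterization, and unlike flat gaming, every missed frame is felt immediately in the headset.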
Me. I do. I *love* raytracing; and, as has been said and seen with several of the newest AAA games, raytracing is no longer optional. It's required, now. Those 1080s, wonderful as they have been for so long (and they have been truly great cards), are definitely in need of an upgrade now.
For competitive play in certain games like Counter-Strike, you need as many FPS as possible.
I went from 80 FPS (highest settings) to 365 FPS (capped, on my Alienware 360Hz monitor) when I upgraded from my old rig (i7-8700K and GTX 1070) to a new one (7800X3D and RTX 3090).
You really want low latency in competitive shooters: from mouse, to game engine, to drivers, to display. There's a lot of nuance in this area, which hardware vendors happily suggest you just throw money at.
Btw, if you're using G-Sync or FreeSync, don't let your framerate hit the display's ceiling; cap it yourself 2-3 frames under the max refresh rate. Reddit to the rescue.
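The frame-time math behind both points is simple enough to sketch; the 3-frame margin is the usual community rule of thumb, not an official figure:

    # Frame time vs. fps, plus the VRR cap rule of thumb.
    for fps in (80, 144, 240, 365):
        print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms per frame")

    refresh_hz = 360
    cap = refresh_hz - 3      # stay inside the G-Sync/FreeSync window
    print(f"suggested framerate cap on a {refresh_hz} Hz panel: {cap} fps")

Going from 80 to 365 fps drops per-frame time from 12.5 ms to about 2.7 ms, which is where the responsiveness difference comes from; the cap just keeps you from falling out of the variable-refresh range into V-Sync.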
It's a leisure activity; "necessary" isn't the metric to be used here. People clearly care about RT/PT, and DLSS seems to be getting better and better.
The 3090 + 5900X is a mistake. The 5900X is basically two 5600X CPUs on one package, so when a game asks for 8 cores, it gets 6 good cores plus 2 very slow ones on the far side of the Infinity Fabric. What's more, Nvidia GPUs take MUCH MORE CPU than AMD GPUs. You should either buy an AMD GPU or upgrade/downgrade to ANYTHING OTHER THAN the 5900X with 8+ cores (5800X, 5800, 5700, 5700X3D, 5950X, 5900XT, anything really...)
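If you're stuck with a two-die part, one workaround is pinning the game to a single die. A minimal sketch, assuming Linux and the common enumeration where CCD0 holds cores 0-5 with SMT siblings 12-17 (check `lscpu -e` for your actual layout):

    # Pin the current process to one CCD so its threads never pay the
    # cross-die Infinity Fabric latency. Core numbering is an assumption.
    import os

    ccd0 = set(range(0, 6)) | set(range(12, 18))
    os.sched_setaffinity(0, ccd0)            # pid 0 = this process
    print(sorted(os.sched_getaffinity(0)))

Six cores with SMT is plenty for most games, and keeping all game threads on one die sidesteps the cross-CCD penalty entirely.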