Virtually nobody needs a 3090, least of all a gamer (let's be honest, though, many will buy one regardless). For the people who actually do need that horsepower, it's unbelievably cheap for what you're getting. You could easily have paid twice that a year ago for less.
> Virtually nobody needs a 3090, much less a gamer
I'm tempted to get one just to avoid having to think about upgrading a graphics card for another 10 years. Plus I can do some ML messing about for resume-driven development.
Best I can get for $500 now (~£377) is an 8GB RTX 2060 - about a fifth the cores, a third the RAM, and a third the memory bandwidth of the RTX 3090, which is £1399. Plus I really don't want to upgrade my PC again for at least 5 years - I've just done that and been reminded of why I hate it.
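For concreteness, the launch specs work out roughly like this (a quick Python sketch; the figures are from the published spec sheets, so treat them as approximate):

    # Rough spec comparison: RTX 2060 vs RTX 3090, launch figures.
    specs = {
        "RTX 2060": {"cuda_cores": 1920, "vram_gb": 8, "bandwidth_gbs": 336},
        "RTX 3090": {"cuda_cores": 10496, "vram_gb": 24, "bandwidth_gbs": 936},
    }

    low, high = specs["RTX 2060"], specs["RTX 3090"]
    for key in low:
        # Print each 2060 figure as a fraction of the 3090's.
        print(f"{key}: {low[key] / high[key]:.2f}x of the 3090")
    # cuda_cores: 0.18x, vram_gb: 0.33x, bandwidth_gbs: 0.36x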
> Virtually nobody needs a 3090, much less a gamer (let's be honest, though, many will buy one regardless).
That statement is absolutely true in my case. Honestly, I don't need a gaming system at all, let alone one with such a powerful GPU. But even for my leisure-time gaming, I could never justify the price difference over a much cheaper card.
Still, I could imagine it making a lot more sense for other people. E.g., pro gamers, people with big entertainment budgets, or people using CUDA for number-crunching.
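To make the number-crunching case concrete, here's a minimal sketch using CuPy (my choice of library - any CUDA array package would do; assumes a CUDA-capable GPU and `pip install cupy`), where the 3090's 24GB of VRAM is what you're actually paying for:

    # Minimal GPU number-crunching sketch (CuPy; sizes are illustrative).
    import cupy as cp

    n = 16_384  # each float32 n x n matrix is n*n*4 bytes = 1 GiB
    a = cp.random.random((n, n), dtype=cp.float32)
    b = cp.random.random((n, n), dtype=cp.float32)

    c = a @ b                          # matrix multiply runs on the GPU
    cp.cuda.Stream.null.synchronize()  # wait for the GPU to finish
    print(float(c.sum()))              # copy a single scalar back to host

With three 1GiB matrices plus workspace this barely registers on a 24GB card, while bigger batches simply don't fit on an 8GB one.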
Well yeah, that's just progress for you. It's still more expensive than last year's xx102 chip, though. And it's only about 50% faster than a 2080 Ti at everything except ray tracing: 50% faster for 40% more money.
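Taking those rough figures at face value, the performance-per-dollar gain is marginal:

    # Perf-per-dollar check on the "50% faster, 40% more expensive" claim.
    speedup = 1.50       # 3090 vs 2080 Ti, non-ray-traced workloads (rough)
    price_ratio = 1.40   # 3090 price vs 2080 Ti price (rough)

    gain = speedup / price_ratio - 1
    print(f"{gain:.0%} more performance per dollar")  # -> 7%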
Back in the Kepler days, a 780 Ti was $700, and that was the full-fat GK110 chip. Now the cut-down chip costs twice as much.