Why?
In my opinion Linux desktop environments are terrible compared to Windows.
How's the display scaling these days? Is it still a better experience to run a 4k monitor at a lower resolution?
What's the Nvidia driver situation? Still janky because their drivers are doing their own thing?
I just switched to a 4K monitor last week. Set display scaling to 1.5x in the KDE settings, logged in again, and everything looks great and scales cleanly. I haven't noticed any weird artifacts or bizarre UIs yet. It just works.
Except Spotify, which needs a command-line flag to set its scale factor. But that app is well known to be half-assed on Linux (it also doesn't support input methods, so searching for Japanese songs is a copy-and-paste exercise), and that's not Linux's fault.
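For anyone hitting this: Spotify's Linux client is Chromium-based, so the standard Chromium scaling flag works. This is a sketch; the `1.5` here is an assumption, use whatever factor matches your desktop's scale setting:

```shell
# Launch Spotify with an explicit UI scale factor (a Chromium flag the
# client passes through); adjust 1.5 to match your desktop scaling.
spotify --force-device-scale-factor=1.5
```

You can also bake the flag into a copy of the `.desktop` launcher so it applies on every launch.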
AIUI the Nvidia drivers are a lot better these days, but most Linux users, myself included, know to stay away from Nvidia unless you have a very good reason to use it. AMD cards work beautifully.
Because in my opinion Windows is terrible :) For many reasons.
Linux users who care about the modern desktop use case don't use Nvidia. That's well known. If someone migrates to Linux with an Nvidia card, chances are high they'll switch to AMD on their next GPU upgrade.
> Linux users who care about the modern desktop use case don't use Nvidia.
Which rules out anyone who wants to game or do CUDA stuff.
Everyone is welcome to their own opinions and preferences, but if you ask me, when the answer to someone who wants the most powerful/performant graphics cards is "switch to AMD", that's part of why Linux's modern desktop adoption is still so small. AMD has some good cards, but Nvidia's are better, and OpenCL can't compete with CUDA when it comes to any machine learning work.
If the only option is to use an AMD GPU, you might as well just get a Mac and use actual UNIX.
And honestly, to each their own! But you asked why anyone would use WSL2 and you’ve got a good answer: they want to be able to take advantage of their chosen hardware and access the various Linux tools.
I didn't see an answer that explains how WSL is better than Linux proper, at least not in case when you don't care about Windows itself.
AMD is fine for gaming; I'm using a 5700 XT on Linux for playing games. And AMD will match Nvidia's higher-end cards next month. So I don't see any reason to use Nvidia for that.
WSL offers nothing for gaming or similar use cases that regular Linux doesn't. If you need to use CUDA with Nvidia hardware, you can do it on Linux proper just fine; you don't need WSL for it, and Nvidia provides support.
Yes, I’m aware that Nvidia supports Linux for CUDA. Linux is a very popular headless environment for this reason.
I was responding to your point that Nvidia drivers for HiDPI and other display issues are subpar with "well, everyone who is serious about using Linux on the desktop uses AMD."
First, that's not true (as evidenced by the many people who do CUDA workloads on Linux). Second, my overarching point is that it strikes me as really dismissive to say "well, just don't use the hardware you like/want/need if you want a good Linux desktop experience."
Nvidia's problems are holding back the progress of the Linux desktop, so I recommend avoiding it for that reason alone (besides various other reasons) to anyone already using Linux. But it is usable; your use cases will just be more limited, and performance, when it works, is OK.