Indeed. Arch-based distros are best managed by intermediate to advanced users. Linux Mint is better suited to beginners.
> but I really just want the PC to work for playing games without any issues.
If you decide to give it another go, and you have the means, I suggest using an AMD graphics card. Nvidia's drivers are notorious for being troublesome on Linux, and although they can usually be made to work (either by the user or by a distro developer), the drivers for AMD GPUs are much better integrated with the OS.
I switched to AMD a few years ago and have been very pleased with the results, both in games and in non-gaming tasks. (I don't use my GPU for LLM development, though, so I can't speak to the current state of things in that area.)
> Wayland is not working and will never work because the underlying ideas and goals do not align with that of a desktop.
Can you elaborate on this?
I don't use Wayland because it lacks something I need (unprivileged Scroll Lock LED control) but I'm curious about what else keeps people from using it.
Debian Stable is my distro of choice these days, mainly because it respects my time by avoiding frivolous changes, without getting in my way when I want to change specific things.
Setting it up for modern gaming hardware required a couple of extra steps, which I found to be worthwhile. I now have a system that has proved dependable whenever I need to get work done immediately, and very capable whenever I just want to have fun. (The Backports repository makes bridging that gap easy in most cases.)
Linux Mint is what I suggest to new users. It's based on the widely supported Ubuntu distro, has a good-sized community, and seems aimed at people who don't already know Unix. It also makes a point of stripping out problematic Ubuntu-isms, and has a Debian-based edition waiting in the wings in case that ever becomes unmanageable.
My desktop environment is KDE Plasma, which you can install on just about any distro even if it's not the default. It has a wealth of useful features, lets me tweak or disable them as I see fit, and avoids trying to turn my desktop into a mobile phone interface. (I used Xfce in the past, but its adoption of Gtk 3 transformed it into something that I found frustrating.)
Does anyone know why, when Lennart and friends wrote their XDG Base Directory Specification, they decided that each user should replicate /usr/local/ subdirectories under $HOME/.local/?
Doesn't being under $HOME make .local redundant? I guess one could argue for binaries going in an architecture-specific subdirectory if $HOME was on a shared filesystem, but that's not what's being done here.
To me, $HOME/.local/share and its siblings are just a needless level of indirection, forcing me to jump through an extra hoop every time I want to access what's in there.
(I know it's sometimes possible to override it with an environment variable, but the predictably spotty support for those overrides means I would then have to look for things in two places. I think sensible defaults would be nicer.)
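For reference, here's roughly how the spec says an application is supposed to resolve those paths: honor the environment variable if it's set to an absolute path, otherwise fall back to the default under $HOME. A minimal Python sketch (the app name at the end is made up); the "spotty support" above comes from programs that skip this lookup and hard-code ~/.local/share instead:

```python
import os
from pathlib import Path

def xdg_data_home() -> Path:
    """Resolve the XDG data directory: use $XDG_DATA_HOME if it is set
    to an absolute path, otherwise fall back to the spec's default."""
    value = os.environ.get("XDG_DATA_HOME", "")
    if value and os.path.isabs(value):
        return Path(value)
    return Path.home() / ".local" / "share"

def xdg_cache_home() -> Path:
    """Same pattern for the cache directory."""
    value = os.environ.get("XDG_CACHE_HOME", "")
    if value and os.path.isabs(value):
        return Path(value)
    return Path.home() / ".cache"

# A program that builds its paths this way picks up a user override
# automatically instead of scattering files under the default location.
print(xdg_data_home() / "myapp")   # "myapp" is a hypothetical app name
```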
> Does anyone know why, when Lennart and friends wrote their XDG Base Directory Specification,
It's a Microsoft thing: you must pollute the user's home directory as much as you can.
Can I say that I have 3 daemons on my computer responsible for ... credentials?
This is the way to go.
Dunno the historical reason but I sure as heck find it nice to know without ambiguity that the folder called "share" corresponds to that special directory and isn't a random folder in my home directory for files that were intended to be e.g. shared with someone.
That doesn't align with their choice of $HOME/.cache (to which users need to navigate much less frequently than $HOME/.local/share), nor with how few items $HOME/.local typically saves from landing in $HOME, nor with the normally hidden state of everything starting with a dot.
So if that was their reasoning, it reinforces my view that they didn't think their design through very well.