The GPU is the most expensive component in gaming PCs these days, so it makes the least sense for it to be the hard-wired component: it's where there's the most price diversity among options. I have upgraded GPUs on several computers over the past decade and I'm very thankful I didn't have to rip out the entire motherboard (and detach/reattach everything) to do so.
It's only the cheap components without a wide variety of options that make sense to build in, like WiFi, USB, and Ethernet.
Note this whole discussion is in the context of the 4090. If you're an enthusiast, soldering the GPU to the mobo forces you to spend $200-$700 more every time you upgrade your GPU because you also have to buy a new mobo and possibly a new CPU if the socket changed.
The GPU is also one of the easiest components to swap today. That's not something I want to give up unless I see performance improvements. Cooling doesn't count because I already have an external radiator attached to my open-frame "case".
I went through 3 GPUs before changing motherboards and I'm still bottlenecked on my 3090, not my 5800X3D. After I get a 4090, I expect to upgrade it at least once before getting a new mobo.
Having had a few GPUs go bad on me over the years, I would hate to have to disassemble the entire thing to extricate the mobo/gpu combo for RMA'ing, rather than just removing the GPU and sending it off.
The main reason is that CPUs get old much more slowly than GPUs, but “current” socket generations change quickly. Another reason is the combinatorial explosion of mobo x graphics card feature combinations, which is already hard to manage.
GPUs get old slowly too, but YouTube influencers convince gamers their current card isn’t enough, often and effectively (I’m not excluded from this group).
Then you lose modularity, which is a huge benefit to PC gaming? Now if you want to upgrade to the newest graphics card, you also need to get a new motherboard. Which also could mean you need a new CPU, which also could mean you need new RAM.
Right now you can just switch the graphics card out for another and keep everything else the same.
This is already happening (as was noted in other comments on this article).
One of the most prominent examples is the entire Apple Silicon lineup, which has the GPU integrated as part of the SoC, and is powerful enough to drive relatively recent games. (No idea just what its limits are, but I'm quite happy with the games my M1 Max can play.)
With mini cube PCs growing in popularity, the future will probably look like this: a stock cube PC where every part, whether RAM, GPU, or storage, plugs in with USB-style modularity.
Just start designing motherboards with integrated GPUs of sufficient size.
Case closed.