> It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale.
You won't get the same result as rendering the entire screen at the same scale, because you're snapping windows to integer pixel borders. But that's the whole point. The app, however, is still scaling at exactly 2/3 within that window box, and that's what you actually want, because it gives better results. There's nothing special about scaling the whole screen that you can't do by scaling each individual window after first snapping all windows to integer sizes.
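A rough sketch of the arithmetic I mean (Python, window sizes purely illustrative):

```python
import math
from fractions import Fraction

SCALE = Fraction(2, 3)   # the fractional scale the user asked for

def pixel_box(logical_w, logical_h, scale=SCALE):
    """Snap the window's output box to whole pixels. The content inside is
    still laid out at exactly `scale`; only the window edge absorbs the
    sub-pixel remainder (less than one pixel per side)."""
    return math.ceil(logical_w * scale), math.ceil(logical_h * scale)

# Illustrative numbers: a 1001x701 logical window at 2/3.
print(pixel_box(1001, 701))                      # (668, 468) pixel buffer
print(float(1001 * SCALE), float(701 * SCALE))   # 667.33..., 467.33...
# The app draws at a true 2/3 into the 668x468 box; the window just ends up
# a fraction of a pixel larger than the ideal, instead of the whole content
# being rescaled to ~0.667 or ~0.665.
```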
Only if the app is broken. The rendering engine needs to be able to fill any pixel size at any scale. If you mean that you can no longer do 2x and scale down at the exact perfect ratio, then that's true, but it's only a limitation of that way of rendering. It's also broken for whole-screen scaling, just less noticeably. Firefox and Chrome don't have that issue at all, for example, and neither does anything that renders fonts and vectors. At a guess, this is probably where all this came from: GTK decided integer scaling was all it was going to support, and then everything else was decided as if all clients behaved like that. And I doubt even that's a real problem: the differences in scale between doing 2x-and-downscale per app and per screen are minuscule.
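To put a number on "minuscule" (rough arithmetic, the window width is purely illustrative):

```python
import math
from fractions import Fraction

SCALE = Fraction(2, 3)
logical_w = 1001                     # illustrative window width in logical px

rendered_w = 2 * logical_w           # app renders at 2x: 2002 px wide
ideal_w = logical_w * SCALE          # exact 2/3 target: 667.33... px
snapped_w = math.ceil(ideal_w)       # integer pixel box: 668 px

print(rendered_w / ideal_w)          # per-screen downscale ratio: exactly 3
print(rendered_w / snapped_w)        # per-window downscale ratio: ~2.997
# The per-window ratio is off from the ideal 3:1 by roughly 0.1%.
```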
There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
If you follow the email I linked a while ago, this was mostly a decision in Weston, not particularly related to GTK. But if you want to follow the conversation on where GNOME stands on this, see here: https://gitlab.gnome.org/GNOME/mutter/-/issues/478
The end comments there are what I'm getting at: the only reliable way to do this is to just bypass scaling completely and have something where the client says "never scale my buffer at all on any monitor." But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
> There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
This is just not true. There's nothing special about 1x and 2x when you're rendering a font or a vector. 1.5x may very well be the scale at which everything snaps into the pixel grid, because that's how the pt->px conversion happened to play out at that given DPI. Thinking of 1x and 2x rendering as special cases is exactly the problem here. And even for things you do want to pixel-align, if you leave that to the client, some clients can make more intelligent decisions: there's nothing stopping a UI library from snapping everything to the pixel grid at arbitrary scales by varying the spacing between elements slightly between steps. That's the whole point: the client can do better things in a lot of cases. And while the discussion was about Weston, the Wayland protocol has ended up with this limitation, which makes sense since Weston is the reference implementation. There's currently no way for a client to request the scale factor of the screen and render directly at that factor.
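To make the pt->px point concrete (a toy sketch; the 11pt / 96dpi numbers are just an example, and layout_row is a made-up helper, not any real toolkit's API):

```python
from fractions import Fraction

PT_TO_PX = Fraction(96, 72)   # CSS-style units: 1pt = 1/72in at a nominal 96dpi

def font_px(points, scale):
    return points * PT_TO_PX * scale

for scale in (Fraction(1), Fraction(5, 4), Fraction(3, 2), Fraction(2)):
    print(float(scale), float(font_px(11, scale)))
# 11pt -> 14.67px at 1x, 18.33px at 1.25x, exactly 22px at 1.5x, 29.33px at 2x.
# Whether a size lands on the pixel grid depends on the metrics, not on the
# scale being an integer.

def layout_row(widths_logical, scale):
    """Toy layout pass: snap every element edge to the pixel grid at an
    arbitrary scale, letting element widths vary by at most one pixel
    instead of rescaling their content."""
    edges, x = [], 0
    for w in widths_logical:
        edges.append((round(x * scale), round((x + w) * scale)))
        x += w
    return edges

print(layout_row([33, 33, 33], 1.5))   # [(0, 50), (50, 99), (99, 148)]
```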
> But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
There's no problem with mixed DPI. If you have a 1x screen and a 1.5x screen, render at 1.5x and scale down on the lower-DPI screen while the window spans both. When it's on one screen exclusively, render directly at 1x or 1.5x.
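A minimal sketch of that policy (a hypothetical helper, not any real protocol API):

```python
def choose_render_scale(scales_of_overlapped_screens):
    """Render the window once at the highest scale among the screens it
    overlaps; the compositor downscales the copy shown on lower-DPI screens."""
    return max(scales_of_overlapped_screens)

print(choose_render_scale([1.0]))        # entirely on the 1x screen: render at 1x
print(choose_render_scale([1.5]))        # entirely on the 1.5x screen: render at 1.5x
print(choose_render_scale([1.0, 1.5]))   # spanning both: render at 1.5x and let the
                                         # part on the 1x screen be scaled down
```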
The really serious issues aren't with fonts and vectors; they show up when you have a surface within a surface. (In the context of a browser, think Flash plugins, JS canvases and the like.) Those have to be pixel-sized, which in practice forces you to 1x, 2x, 3x... It's unfortunately a special case that is incredibly common, and it has to be handled. Solving it by snapping everything to the pixel grid is just another form of rounding, with its own set of artifacts: elements jitter around when the window gets resized, and it breaks down again if you try to have surfaces within those subsurfaces (e.g. you decide to display a cached buffer within your Flash app or JS canvas). Mixed DPI makes all of this much worse, and you still have one screen that is going to be scaled down and blurry.
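Here's roughly the kind of jitter I mean (toy numbers; the centered-canvas layout and the 1.5x scale are just for illustration):

```python
SCALE = 1.5   # fractional output scale, purely illustrative

def subsurface_px_x(parent_w_logical, canvas_w_logical=300, scale=SCALE):
    """A nested surface (a canvas, a plugin viewport) has to sit on whole
    pixels, so its position gets rounded independently of the vector
    content drawn around it."""
    x_logical = (parent_w_logical - canvas_w_logical) / 2   # centered canvas
    return x_logical * scale, round(x_logical * scale)      # ideal vs snapped

# Resize the parent one logical pixel at a time: the ideal position moves
# smoothly by 0.75 px per step, while the snapped position jumps by 0 or 1 px.
for parent_w in range(640, 646):
    print(parent_w, *subsurface_px_x(parent_w))
```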
At some point maybe I'll put together a post to illustrate this. I know it seems easy when you think about it, but if you try to implement it you'll see it causes a huge number of problems. They're not unsolvable, but the number of clients that are going to care enough to solve them all is very small. (Fundamentally, this can't really be solved in a browser either, for the reasons I described above.)
If you want to work on this, please do it, but don't say I didn't warn you that it's a rabbit hole. I've actually done everything you're suggesting before in other projects, and honestly we've really only scratched the surface of why trying to do this kind of scaling is problematic.
I don't get the argument that browsers are an example of how hard this is, when browsers today are already perfectly capable of doing the right thing as clients. So is Windows, by the way. What keeps me from having a fractionally scaled desktop is Wayland. Firefox already works, and so does Chrome.