You can't be pixel perfect in the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide. The result is the same on the 1.5x display, and on the 1.1x display it's worse, because now you're re-scaling the line that is already anti-aliased, blurring it twice.
If you get a 133px box, you are cutting off pixels at the bottom.
Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.
> You can't be pixel perfect in the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide.
Pixel-perfect is not the standard. If you're rendering a PDF, everything starts out in vector space, and there's nothing special about 1x and 2x. It may very well be that 1.5x is the scale at which the PDF line becomes pixel aligned; you just don't know. But the app might know, and that's the point. Asking it to render at 2x and then scaling by a factor it's not aware of is strictly worse than giving it all the information it needs to make good decisions.
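To make the pt-to-px point concrete, here's a small sketch (assuming the conventional 96 DPI baseline for 1x; constant and function names are mine): a 1pt line lands exactly on the pixel grid at 1.5x, but at neither 1x nor 2x.

```python
# Sketch: how a 1pt PDF line maps to device pixels at different scales,
# assuming the common 96 DPI logical baseline for a 1x desktop.
BASE_DPI = 96     # assumed 1x desktop DPI
PT_PER_INCH = 72  # PostScript points per inch

def line_width_px(width_pt, scale):
    """Physical pixel width of a PDF line at a given display scale."""
    return width_pt * BASE_DPI * scale / PT_PER_INCH

for s in (1.0, 1.5, 2.0):
    print(f"{s}x -> {line_width_px(1.0, s):.3f}px")
# 1.0x -> 1.333px  (not pixel aligned)
# 1.5x -> 2.000px  (exactly pixel aligned)
# 2.0x -> 2.667px  (not aligned either)
```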
> If you get a 133px box, you are cutting off pixels at the bottom.
No you're not. You're just sizing the window to integer pixel sizes. Just like every X WM and Wayland compositor does. Just as sway doesn't let you create a 133.333px window at 1x, it shouldn't let you create one at 1.5x. But it does, so the whole desktop is not pixel aligned, not even the window origins.
> Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.
This is precisely what I do: increase the font scale in GTK and the interface scale in Firefox. That this gives me better results than the current solution is evidence that it can be done better. Unfortunately it breaks down in mixed-DPI situations.
With the PDF you would see the same results using a filtered scaler. There is unfortunately no way to do this that doesn't break down in mixed-DPI situations because of the rounding issues. If you want to add this to a Wayland implementation, I won't (and can't) stop you, but I think it's a bad idea. In my experience from writing various scaled interfaces, it becomes impossible to ensure that anything is pixel aligned once you introduce a non-integer transform into the graph. And when the top node of the graph is the screen, that severely limits what you can do there.
>You're just sizing the window to integer pixel sizes. Just like every X WM and Wayland compositor does.
To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.
> There is unfortunately no way to do this that doesn't break down in mixed-DPI situations because of the rounding issues.
Then I don't think I've managed to explain it. This way there's strictly less rounding than today, because from Wayland's point of view every window is at native resolution and it's up to the client to scale.
> To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.
I don't understand what you're saying. 133px is correct, just like 134px is correct. The way WMs and compositors work is that they split the screen between windows at integer coordinates. That's how it works at 1x, and there's no reason for it not to work like that at 1.5x. They're completely independent issues.
It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale. What you are describing is rounding; you can't do this without rounding. Worse, with that method the scale technically changes every time you resize the window. This is hard to explain in text on HN, and I don't have reference images to illustrate it at the moment, sorry.
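The drift can be shown in a few lines (a sketch with hypothetical window heights, assuming a configured 2/3 output scale and integer-sized buffers): the scale the client effectively renders at is rarely 2/3 and changes with every resize.

```python
from fractions import Fraction

SCALE = Fraction(2, 3)  # configured output scale (assumption)

def effective_scale(logical_px):
    """Scale the client actually achieves once its buffer
    is rounded to an integer pixel size."""
    physical = round(logical_px * SCALE)  # buffers must be integer sized
    return physical / logical_px

for h in (200, 201, 202, 203):
    print(h, round(effective_scale(h), 5))
# 200 0.665
# 201 0.66667
# 202 0.66832
# 203 0.66502
```

Only the 201px case happens to land exactly on 2/3; every neighboring size gets a slightly different effective scale.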
> It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale.
You won't get the same result as rendering the entire screen at the same scale, because you're snapping windows to integer borders. But that's the whole point. The app, however, is scaling at 2/3 within that window box, and that's actually what you want because it gives better results. There's nothing special about scaling the whole screen that you can't do by scaling each individual window after first snapping all windows to integer sizes.
Only if the app is broken. The rendering engine needs to be able to fill whatever pixel size at whatever scale. If you mean that you can no longer render at 2x and scale down by the exact perfect factor, then that's true, but it's only a limitation of that way of rendering. It's also broken for whole-screen scaling, just less noticeably. Firefox and Chrome don't have that issue at all, for example, and neither does anything that renders fonts and vectors. At a guess, this is probably where all this came from: GTK decided integer scaling was all it was going to support, and then everything else was designed as if all clients behaved like that. I doubt even that's a problem; the differences in scale from doing 2x-and-scale-down per app instead of per screen are minuscule.
There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
If you follow the email I linked a while ago, this was mostly a decision in Weston, not particularly related to GTK. But if you want to follow where GNOME is at with this, see here: https://gitlab.gnome.org/GNOME/mutter/-/issues/478
The final comments there are what I'm getting at: the only reliable way to do this is to bypass scaling completely and have something where the client says "never scale my buffer at all on any monitor." But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
> There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
This is just not true. There's nothing special about 1x and 2x when you're rendering a font or a vector; 1.5x may very well be the scale at which everything snaps into the pixel grid, because that's how the pt-to-px conversion happened to play out at that DPI. Thinking of 1x and 2x rendering as special cases is exactly the problem here. And even for things you do want pixel aligned, clients can make more intelligent decisions if you leave it to them: nothing stops a UI library from snapping everything to the pixel grid at arbitrary scales by varying the spacing between elements slightly between steps. That's the whole point: the client can do better in a lot of cases. And while the discussion was about Weston, the Wayland protocol has ended up with this limitation, which makes sense since Weston is the reference implementation. There's currently no way for a client to request the scale factor of the screen and render directly at that factor.
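The "vary the spacing slightly" trick is just error diffusion on the element edges. A hypothetical sketch (function name and layout model are mine): round every edge, not every width, so rounding error never accumulates and each item is off by at most 1px.

```python
def snapped_spans(n_items, item_logical, scale):
    """Lay out n items of equal logical size at a fractional scale,
    snapping every edge to the device pixel grid. Widths vary by at
    most 1px, but edges are always pixel aligned and the total is exact."""
    edges = [round(i * item_logical * scale) for i in range(n_items + 1)]
    return [(edges[i], edges[i + 1]) for i in range(n_items)]

print(snapped_spans(3, 10, 2 / 3))
# [(0, 7), (7, 13), (13, 20)] -- widths 7, 6, 7 instead of 6.67 each
```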
> But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
There's no problem with mixed DPI. If you have a 1x screen and a 1.5x screen, render at 1.5x and scale down on the lower screen while the window spans both. When it's on one screen exclusively, render directly at 1x or 1.5x.
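As a sketch of that client-side policy (the function name and shape are mine, not any protocol's): the only time any compositor rescaling happens is the transient case of spanning two outputs, and even then only on the lower-DPI one.

```python
def pick_buffer_scale(overlapped_output_scales):
    """Hypothetical policy: render natively when the window sits on a
    single output; when spanning outputs, render at the highest scale
    and let the compositor downscale on the lower-DPI output only."""
    scales = list(overlapped_output_scales)
    if len(scales) == 1:
        return scales[0]   # single output: pixel perfect, no rescale at all
    return max(scales)     # spanning: blurry only on the lower-DPI screen

print(pick_buffer_scale([1.5]))       # 1.5
print(pick_buffer_scale([1.0, 1.5]))  # 1.5
```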
The really serious issues are not with fonts and vectors; they appear when you have a surface within a surface (in the context of a browser, think Flash plugins, JS canvas and the like). Those have to be pixel sized, which practically means you are forced to 1x, 2x, 3x... It is unfortunately a special case that is incredibly common, and it has to be handled. Solving this by snapping everything to the pixel grid is another form of rounding with its own artifacts: elements jitter around when the window is resized, and it breaks down again if you try to have surfaces within those subsurfaces (e.g. you decide to display a cached buffer within your Flash app or JS canvas). Mixed DPI exacerbates all of these problems, and you still have one screen that is going to be scaled down and blurry.
At some point maybe I'll put together a post to illustrate this. I know it seems easy when you think about it, but if you try to implement it you'll see it causes a huge number of problems. They are not unsolvable, but the number of clients that will care enough to solve them all is very small. (Fundamentally this cannot really be solved in a browser either, for the reasons I described above.)
If you want to work on this, please do, but don't say I didn't warn you that it's a rabbit hole. I've actually done everything you're suggesting before in other projects, and honestly we've only scratched the surface of why trying to do this kind of scaling is problematic.
I don't get the argument that browsers are an example of how hard this is, when browsers today are already completely capable of doing the right thing as clients. And so is Windows, by the way. What keeps me from having a fractionally scaled desktop is Wayland. Firefox already works, and so does Chrome.