Wayland just punts on the issue by removing the compositor from any decision involving scaling. The "Wayland Way" is for the compositor to relay information (monitor-reported scale, etc) to the client, and have the client figure out how to scale itself correctly.
Unfortunately this means that these HiDPI bugs have to be fixed over and over in myriad apps and toolkits, and in software (cough cough Chromium) that is so complicated that it cannot use a toolkit.
Some compositors like sway will let you do in-compositor scaling, but this is admittedly a kludge: it produces blurry scaled-up output instead of rendering at the right resolution in the first place. The sway authors acknowledge this; the feature is only there as a "better than nothing" workaround.
It would actually be great if Wayland allowed clients to handle HiDPI themselves, but it doesn't, which makes fractional scaling much worse than it needs to be. Wayland defines integer scaling only, and then compositors scale 2x content down to 1.5x or whatever it is you want. This means that if you have an app that's capable of drawing at arbitrary scale, you are still forced to draw at 2x and then be scaled down by the compositor. This is needlessly blurry and a performance penalty. Text can be drawn at arbitrary scale but ends up blurry because of this. Browsers can render at any resolution but pay substantial performance penalties because of this (roughly 2x the pixels at common fractional scales). Anything that renders images also gets needless downscales, with both quality and performance penalties. It's a baffling decision not to do this client-side, and hopefully it will be fixed as the protocol evolves.
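To put rough numbers on the overdraw (a minimal sketch with made-up window dimensions, not measurements from any real compositor):

    /* Rough numbers for "render at 2x, let the compositor downscale to
     * 1.5x" with a hypothetical 800x600-logical window. */
    #include <stdio.h>

    int main(void)
    {
        int logical_w = 800, logical_h = 600;
        double target_scale = 1.5;  /* what actually ends up on screen */
        int buffer_scale = 2;       /* the integer scale Wayland lets the client use */

        double on_screen = (logical_w * target_scale) * (logical_h * target_scale);
        double rendered  = (double)(logical_w * buffer_scale) * (logical_h * buffer_scale);

        printf("pixels on screen: %.0f\n", on_screen);             /* 1080000 */
        printf("pixels rendered:  %.0f\n", rendered);              /* 1920000 */
        printf("overdraw factor:  %.2fx\n", rendered / on_screen); /* ~1.78x  */
        return 0;
    }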
This was discussed at length years ago and it was decided that passing a fractional scale value to the client is not what you want, makes the code much more complicated and doesn't really solve the blurriness. A bit of background here: https://lists.freedesktop.org/archives/wayland-devel/2013-Ma...
To illustrate, I usually use a simple example: How do you render a 20px tall font at scale 1 2/3 without being blurry or without rounding errors? And if you wanted to avoid any additional scaling artifacts or rounding errors, how tall would you make the output buffer, in pixels? What happens when you try to stretch this window across multiple screens?
>To illustrate, I usually use a simple example: How do you render a 20px tall font at scale 1 2/3?
You replace it with a 33px tall font.
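A quick sketch of the arithmetic behind that answer, just to make the rounding explicit (the 20px and 1 2/3 numbers are from the question above):

    /* A 20px glyph at scale 1 2/3: the ideal height is not an integer,
     * so something has to round. */
    #include <stdio.h>

    int main(void)
    {
        double scale = 5.0 / 3.0;     /* 1 2/3 */
        double ideal = 20.0 * scale;  /* 33.333... device pixels */
        int rounded  = (int)(ideal + 0.5);

        printf("ideal: %.4f px, rounded: %d px\n", ideal, rounded);
        /* A 33px glyph implies an effective scale of 33/20 = 1.65, a ~1%
         * deviation that the renderer or a later scaling pass has to absorb. */
        return 0;
    }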
This is what Gnome 3.38 actually does in my experience: on the computer on which I am writing this, I have a scale factor of 1.50 set in the Displays pane of Gnome Settings. Google Chrome and vscode show up in the output of xlsclients, which I am told means that they are talking to XWayland (and there are web pages promising that in a few months those 2 apps will be adapted to talk directly to wayland), and those 2 apps are blurry whereas Gnome Terminal and Evince (my PDF reader) do not show up in the output of xlsclients and those 2 apps are not blurry, but their default text size and the size of their UI elements are consistent with everything being scaled by 1.5.
Yeah, well, what Gnome does is that if you ask it for 150% scaling, it actually gives you 147% scaling or 152% scaling to make the numbers come out even.
Don't know how Gnome does that[1], but Apple certainly does, and it allows them to limit compounding of the pixel errors to a group of 8 or 9 pixels. Each new pixel group starts without any compounded error.
[1] Last time I checked, they didn't scale the entire framebuffer as Apple does with the output scaler. It looked like they were scaling surfaces individually on the GPU and then composing the scaled surfaces into a framebuffer with the same dimensions as the display resolution.
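To illustrate the kind of adjustment being described (my own sketch, not GNOME's or Apple's actual algorithm): nudge the requested scale so the logical size comes out as a whole number.

    /* Snap a requested fractional scale so physical_px / scale is an
     * integer. Build with: cc snap.c -lm */
    #include <math.h>
    #include <stdio.h>

    static double snap_scale(int physical_px, double requested)
    {
        double logical = round(physical_px / requested); /* force an integer logical size */
        return physical_px / logical;                    /* the scale that hits it exactly */
    }

    int main(void)
    {
        /* e.g. a hypothetical 2880px-wide panel at a requested 1.75x */
        printf("adjusted scale: %.4f\n", snap_scale(2880, 1.75)); /* ~1.7497 */
        return 0;
    }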
> This was discussed at length years ago and it was decided that passing a fractional scale value to the client is not what you want, makes the code much more complicated and doesn't really solve the blurriness.
From your link:
"""
> While rendering at rational factor can't be of better quality, it can
> however be done at a much higher quality than downscaling, faster and with
> less memory usage.
Perhaps, but that benefit diminishes compared to the cons.
"""
Being blurry and losing performance was deemed acceptable for other benefits. Maybe I'm discounting those benefits but the penalties are very real. 2x performance penalties in performance sensitive code and blurry scalable content like fonts are real downsides.
Looking further down the thread, what I think would be the ideal solution is actually proposed, and gets no response:
The current solution actually seems much more complex than this, as the protocol needs to know about scaling, instead of the scaling factor being just some metadata about how things should be (and have been) rendered, with everything else handled in client and compositor code.
> How do you render a 20px tall font at scale 1 2/3 without being blurry or without rounding errors?
Fonts are not sized in pixels for the most part. Modern fonts are resolution independent, and the renderers try hard to fit them to however many pixels you have. Faking them out by saying "here's a 200x200 box to do your work" and then later scaling that box down to 150x150 prevents that code from working properly.
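A small illustration of that point, assuming the conventional 96 DPI baseline: at an effective 1.5x the point-to-pixel conversion can already land on whole pixels, whereas the 2x-then-downscale path rasterizes at one size and resamples to another.

    /* 12pt text, 96 DPI baseline: at an effective 1.5x the glyph is
     * pixel-exact; the 2x-then-downscale path rasterizes at 32px and then
     * resamples back down to 24px. */
    #include <stdio.h>

    int main(void)
    {
        double pt = 12.0, base_dpi = 96.0;

        double direct     = pt * (base_dpi * 1.5) / 72.0; /* 24.0 px, rendered once */
        double at_2x      = pt * (base_dpi * 2.0) / 72.0; /* 32.0 px */
        double downscaled = at_2x * (1.5 / 2.0);          /* 24.0 px, but filtered
                                                             from the 32px raster */

        printf("direct at 1.5x:     %.1f px\n", direct);
        printf("via 2x + downscale: %.1f px (resampled)\n", downscaled);
        return 0;
    }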
> What happens when you try to stretch this window across multiple screens?
For this case you render at the maximum of the two fractional scales and scale down on the lower one. It's no worse than the current solution for that screen and better for the one that matches.
The solution in that email is essentially how it works; the only difference is that the scaling factor is forced to be an integer.
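As a sketch of that policy (the function name and the list of outputs are mine, not from any protocol or compositor):

    /* "Render once at the highest scale" for a window spanning outputs
     * with different fractional scales. */
    #include <stdio.h>

    static double pick_render_scale(const double *output_scales, int n)
    {
        double best = 1.0;
        for (int i = 0; i < n; i++)
            if (output_scales[i] > best)
                best = output_scales[i];
        return best;
    }

    int main(void)
    {
        double scales[] = { 1.0, 1.5 };  /* the outputs the window currently touches */
        printf("render at %.2fx; the 1.0x output shows a downscaled copy\n",
               pick_render_scale(scales, 2));
        return 0;
    }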
>By faking them out and telling them "here's a 200x200 box to do your work" and then later scaling down that box to 150x150 prevents that code from working properly.
The question I was getting at is: how do you write a renderer that does the right thing and works properly when given a box of 133.333333 x 133.333333 pixels? If you say "round in some direction", that is now a policy that must propagate to every bit in the stack that touches input and rendering.
>It's no worse than the current solution for that screen and better for the one that matches.
It actually is much worse for the one that doesn't match, because now you have lost the ability to do pixel-perfect scaling.
It's not how it works at all for fractional scaling, which is the point. And you don't give it a 133.33 box; that's what you're in effect doing now by having the compositor scale the whole screen. You give it an actual 133px box, because since you're working at native resolution, apps get snapped to pixel boundaries.
> It actually is much worse for the one that doesn't match, because now you have lost the ability to do pixel-perfect scaling.
I don't know what you mean by this. Pixel perfect scaling is precisely what's not possible right now. If you have a 1.3x screen and a 1.5x screen you can't render 1:1 to any of them. With this you could at least have 1:1 in the 1.5x screen and be pixel perfect there.
You can't be pixel perfect on the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide. The result is the same on the 1.5x display, and on the 1.3x display it's worse, because now you're re-scaling a line that is already anti-aliased, blurring it twice.
If you get a 133px box, you are cutting off pixels at the bottom.
Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.
> You can't be pixel perfect in the 1.5x screen if your renderer is scaling correctly and thus is rendering lines at 1.5 pixels wide.
Pixel-perfect is not the standard. If you're rendering a PDF everything starts out in vector space and there's nothing special about 1x and 2x. It may very well be that 1.5x is the scale at which the PDF line becomes pixel aligned. You just don't know. But the app might know, and that's the point. Asking it to do 2x and then scaling by a factor it's not aware of is strictly worse than giving it all the information to make good decisions.
> If you get a 133px box, you are cutting off pixels at the bottom.
No you're not. You're just sizing the window to integer pixel sizes, just like every X WM and Wayland compositor does. Just like sway doesn't let you do a 133.333px window at 1x, it shouldn't let you do one at 1.5x. But it does, so the whole desktop is not pixel aligned, not even the start of the windows.
> Edit: What you are asking is possible, right now. Just leave your primary display at 1x scale, change the secondary to 11/15 scale, and then turn up the sizing in your applications. It's actually better that nothing is needed in the Wayland protocol to do that.
This is precisely what I do. Increase the font scale in GTK and the interface scale in Firefox. That this gives me better results than the current solution is evidence that it can be done better. Unfortunately it breaks down in mixed-DPI situations.
With the PDF you would see the same results using a filtered scaler. There is unfortunately no way to do this that doesn't break down in mixed-DPI situations, because of the rounding issues. If you want to add this to a wayland implementation, I won't (and can't) stop you, but I think it's a bad idea. In my experience from trying to write various scaled interfaces, it becomes impossible to ensure that anything is pixel aligned after you introduce a non-integer transform in the graph. When the top point in the graph is the screen, that limits what you can do there a lot.
>You're just sizing the window to integer pixel sizes. Just like every X WM and Wayland compositor does.
To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.
> There is unfortunately no way to do this that doesn't break down in mixed-DPI situations because of the rounding issues.
I don't think I've been able to explain it, then. There's strictly less rounding than today this way, because from the point of view of Wayland every window is at native resolution and it's up to the client to scale.
> To be clear, this is if you configured your output scaling to be 2/3. 133px is absolutely not correct there.
I don't understand what you're saying. 133px is correct, just like 134px is correct. The way WMs and compositors work is they split the screen between windows at integer coordinates. That's how it works at 1x, and there's no reason not to work like that at 1.5x. They're completely independent issues.
It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale. What you are describing is rounding, you can't do this without rounding. Worse, with that method the scale technically changes every time you resize the window. This is hard to explain in text on HN and I don't have reference images to illustrate this at the moment, sorry.
> It's not correct because you are not rendering at 2/3 scale anymore, you are rendering at 0.665 or 0.67 scale.
You won't get the same result as rendering the entire screen at the same scale because you're snapping windows to integer borders. But that's the whole point. The app however is scaling at 2/3 within that window box and that's actually what you want because it gives better results. There's nothing special about scaling the whole screen that you can't do by scaling each individual window after first snapping all windows to integer sizes.
Only if the app is broken. The rendering engine needs to be able to fill whatever pixel size at whatever scale. If you mean that you can no longer do 2x and scale down at the exact perfect scale, then that's true, but it's only a limitation of that way of rendering. It's also broken for whole-screen scaling, just less noticeably. Firefox and Chrome don't have that issue at all, for example, and neither does anything that renders fonts and vectors. At a guess this is probably where all this came from: GTK decided integer scaling was all it was going to support, and then everything else was decided as if all clients behaved like that. I doubt even that's a problem; the differences in scale from doing 2x-and-scale-down per app instead of per screen are minuscule.
There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
If you follow the email I linked a while ago, this was mostly a decision in Weston, not really particularly related to GTK. But if you want to follow the conversation on where Gnome is at with this, see here: https://gitlab.gnome.org/GNOME/mutter/-/issues/478
The end comments there are what I'm getting at: the only reliable way to do this is to just bypass scaling completely and have something where the client says "never scale my buffer at all on any monitor." But of course that has the same issues with mixed DPI, and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
> There is no "non-broken" way that an app can consistently render at a real number scale into an integer sized buffer. Everything is going to have rounding or blurring in some way. This is unavoidable.
This is just not true. There's nothing special about 1x and 2x when you're rendering a font or a vector. 1.5x may very well be the scale at which everything snaps into the pixel grid, because that's how the pt->px conversion happened to play out at that given DPI. Thinking of 1x and 2x rendering as special cases is exactly the problem here. And even for things you do want to pixel-align, if you leave that to the client, some clients can make more intelligent decisions. There's nothing stopping a UI library from snapping everything to the pixel grid at arbitrary scales by varying the spacing between elements slightly between steps, as sketched below. That's the whole point: the client can do better things in a lot of cases. And while the discussion was about Weston, the Wayland protocol has ended up with this limitation, which makes sense since Weston is the reference implementation. There's currently no way for a client to request the scale factor of the screen and render directly at that factor.
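For what it's worth, here's a toy version of that "snap edges to the device pixel grid by varying spacing" idea, assuming round-to-nearest placement (the numbers are made up):

    /* Snap UI element edges to whole device pixels at a fractional scale:
     * edges stay sharp, element widths wobble by about a pixel.
     * Build with: cc snap_edges.c -lm */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double scale = 1.5;
        int logical_edges[] = { 0, 21, 43, 64 }; /* element boundaries, logical px */

        for (int i = 0; i < 4; i++) {
            int device = (int)round(logical_edges[i] * scale);
            printf("logical %2d -> device %2d\n", logical_edges[i], device);
        }
        /* Edges land at 0, 32, 65, 96: widths come out 32/33/31 instead of
         * the exact 31.5/33/31.5, but every edge sits on a whole pixel. */
        return 0;
    }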
> But of course that has the same issues with mixed DPI and you would only really want to use it for performance-sensitive applications that are not particularly sensitive to scale.
There's no problem with mixed DPI. If you have a 1x screen and a 1.5x screen, render at 1.5x and scale down on the lower screen when spanning both. When on one screen exclusively, render directly at 1x or 1.5x.
The real serious issues are not really with fonts and vectors, it's when you have a surface within a surface. (In the context of a browser, think flash plugins, JS canvas and the like.) Those have to be pixel-sized, which means practically you are forced to 1x, 2x, 3x... It is unfortunately a special case that is incredibly common, and it has to be handled. Solving this by snapping everything to the pixel grid is another form of rounding that has its own set of artifacts, with elements jittering around when the window gets resized, and it breaks down again if you try to have surfaces within those subsurfaces. (e.g. you decide to display a cached buffer within your flash app or JS canvas) These problems are exacerbated further when you have mixed DPI (it actually makes things much worse), and you still have one screen that is going to be scaled down and blurry.
At some point maybe I'll put together a post to illustrate this. I know it seems easy when you think about it, but if you try to implement it you'll see it causes a huge number of problems. They are not unsolvable, but the number of clients that are going to care enough to solve them all is very small. (Fundamentally this cannot really be solved in a browser either, for the reasons I described above.)
If you want to work on this, please do it, but don't say I didn't warn you that it's a rabbit hole. I've actually done everything you're suggesting before in other projects, and honestly we've really only scratched the surface of why trying to do this kind of scaling is problematic.
I don't get the argument that browsers are an example of how hard this is when browsers are today already completely capable of doing the right thing as clients. And so does Windows by the way. What keeps me from having a fractionally scaled desktop is Wayland. Firefox already works, and so does Chrome.
You already have to propagate code all over to handle integer multiples, don't you?
The email seems to be worried about things happening at the wayland level with fractions, but that's easily solved by rounding entire windows to the nearest pixel.
You do have to propagate the scaling factor, but as long as it's an integer scale, you don't have to convert all your coordinates to use floating-point everywhere and enforce rounding rules.
I'd also like to point out that if you are rounding in your rendering anywhere, you are now likely either introducing blurriness again, or something is getting scaled incorrectly. The only real way I know to do this without causing artifacts or error build-up is to have the compositor do the rounding only as the last step, which is the way it works now, and you have to use an integer scale for that. As you describe, this is the "easily solved" way, but it's only really easy because the compositor gets to handle it internally -- once you push that problem to clients it gets significantly harder.
Quite the opposite. Once you push the problem to the client, a lot of them already know how to do it, and by not doing two scaling steps you can make much better decisions. And you don't need floating-point coordinates at all. What you need to do is tell the app "you have a 150x150 window on which to display content at 1.33x scale, just give me back a buffer with those properties". Wayland can work in integer coordinates all the way. The app also draws at integer coordinates; it just uses the scaling factor to know what to do. It can even just replicate the 2x-and-scale-down solution if it doesn't know how to do anything better, but plenty of code does know how to do something better and much faster. Browsers, image viewers, 3D renderers and PDF viewers are all natively able to scale arbitrarily, and yet are forced by wayland to draw at 2x and be scaled down.
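To make that concrete, here's a hypothetical sketch of what such a hand-off could look like. None of these names exist in the Wayland protocol; the scale is passed as a ratio only to keep floats off the wire.

    /* Hypothetical, NOT part of any Wayland protocol: the compositor hands
     * the client an exact pixel size plus a fractional scale, and the
     * client returns a buffer of exactly that size. All names are invented. */
    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    struct surface_geometry {
        int32_t pixel_w, pixel_h; /* buffer size presented 1:1, no compositor scaling */
        int32_t scale_num;        /* fractional scale, e.g. 4/3 ~= 1.33x */
        int32_t scale_den;
    };

    static uint32_t *render_at_native_size(const struct surface_geometry *g)
    {
        double scale = (double)g->scale_num / g->scale_den;
        printf("render %dx%d px at %.3fx (roughly %dx%d logical)\n",
               g->pixel_w, g->pixel_h, scale,
               (int)(g->pixel_w / scale), (int)(g->pixel_h / scale));
        /* A real client would draw here; we just allocate the pixel buffer. */
        return calloc((size_t)g->pixel_w * g->pixel_h, sizeof(uint32_t));
    }

    int main(void)
    {
        /* "you have a 150x150 window on which to display content at 1.33x" */
        struct surface_geometry g = { 150, 150, 4, 3 };
        free(render_at_native_size(&g));
        return 0;
    }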
They aren't forced to draw at 2x and be scaled down. They only need to do that if you have enabled fractional scaling. If you really want to avoid the overdraw associated with that, just don't use any scaling on your primary display and then turn the font size in your client up. Anything else that passes the fractional scale onto the client is going to have issues. The reason you pass the buffer scale in wayland is because you explicitly want the compositor to scale your buffer and you want it to do it correctly. If you don't want that, don't bother with messing with fractional scaling.
>you have a 150x150 window on which to display content at 1.33x scale, just give me back a buffer with those properties
Again, this comes back to the issue where now you need to deal with rounding in the client. Only the simplest of browsers, image viewers, 3D renderers, PDF viewers that have no UI chrome could get away with just passing on a fractionally scaled buffer. And for those, you can do what I described above, disable fractional scaling and do your scaling in the client, you don't need any special support in the compositor. I believe in most applications you can press Ctrl-Plus and Ctrl-Minus :)
> They aren't forced to draw at 2x and be scaled down. They only need to do that if you have enabled fractional scaling.
Fractional scaling is exactly what we're talking about. At 1x/2x everything works.
> If you really want to avoid the overdraw associated with that, just don't use any scaling on your primary display and then turn the font size in your client up.
That's what I do, but it doesn't work with mixed DPI. The fact that it works and improves things just shows how this could be done better.
> The reason you pass the buffer scale in wayland is because you explicitly want the compositor to scale your buffer and you want it to do it correctly. If you don't want that, don't bother with messing with fractional scaling.
That's circular. You're arguing that the way things are done is an argument for things to continue to be done like that.
> Only the simplest of browsers, image viewers, 3D renderers, PDF viewers that have no UI chrome could get away with just passing on a fractionally scaled buffer.
Both Firefox and Chrome have their whole UI scalable at arbitrary values and have for many years. See layout.css.devPixelsPerPx in Firefox for example. There's nothing special about the window where the Webpage/PDF/3D gets rendered versus the UI. Vector UIs that can scale arbitrarily not only exist, they're widely used. And the worst case is still "if you don't know any better, do 2x and scale down yourself, client".
> And for those, you can do what I described above, disable fractional scaling and do your scaling in the client, you don't need any special support in the compositor. I believe in most applications you can press Ctrl-Plus and Ctrl-Minus :)
You can't disable fractional scaling on a per-program basis, it wouldn't work at all, and it would still be broken in a mixed DPI setting. I want the Browser/PDF/Image/3D windows on my 1440p screen to have different scaling from my 4K screen and for them to switch scaling automatically, just like they resize automatically, when I switch them between screens. That's currently not possible because there's no support for it in Wayland.
>Fractional scaling is exactly what we're talking about. ... That's circular. You're arguing that the way things are done is an argument for things to continue to be done like that.
Actually no, what you are describing is ideally having no scaling at all in the compositor on any particular monitor. (Unless of course a window is stretched between them) I'm not arguing for things to be done any way, I'm saying don't misuse a feature that was designed to do something else than what you're asking for.
For a vector UI that is supposed to be pixel-aligned, you can't render that at a non-integer scale without the same blurriness that you would get with doing the scaling in the compositor. That's different from rendering print content that has no relation to pixels.
> For a vector UI that is supposed to be pixel-aligned, you can't render that at a non-integer scale without the same blurriness that you would get with doing the scaling in the compositor. That's different from rendering print content that has no relation to pixels.
That's one case where the downside is the same, not even worse, and the cases where that's not the case are broken. Wayland pushes a lot of things to the client, so it's surprising that the client can't make this decision when at worst the result has the same problem. And there are a lot of clients that could take advantage of this today. The browsers we are using to have this discussion are made needlessly blurry because of this. They have the whole stack ready for fractional scaling themselves and are forced to 2x and scale down by the compositor.
> I'd also like to point out that if you are rounding in your rendering anywhere, you are now likely either introducing blurriness again, or something is getting scaled incorrectly.
Some things will be blurry, but no worse than having the compositor do it. Other things will not be blurry, and that's a huge huge benefit.
> artifacts or error build-up
Integer multiples risk artifacts too, if the program is actually supposed to be rendering in high resolution. And I'm not sure error build-up should be a big worry when you have 53 bits of precision for a window a couple thousand pixels across.
> As you describe, this is the "easily solved" way
I'm not sure you understood what I meant. I was saying that when possible the compositor should round the window to the nearest pixel, then have the client render at that resolution, so then the compositor would not do any scaling at all.
> The "Wayland Way" is for the compoisitor to relay information (monitor-reported scale, etc) to the client, and have the client figure out how to scale itself correctly.
FWIW this is also the "Xorg Way" (RandR provides all the necessary information), it is just that clients do not bother.
Ideally the window manager would use the information provided by RandR to set up default scale levels for each monitor, and then send notifications to the applications whenever a window changes monitor so they can alter their scale levels (this would also allow a custom per-window scale level applied on top of the per-monitor scale level, in case someone wants to scale up/down a specific window - e.g. scaling up some notepad-like editor for a screencast).
This does require each client (and toolkit) to support arbitrary scaling... and window managers to agree on such a message, though I do remember an email about this topic being posted on the Xorg mailing list some time ago. AFAIK Qt should already provide the necessary functionality for this.
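For reference, RandR really does expose the raw information a client would need. A minimal query (assuming libX11/libXrandr and the conventional 96 DPI baseline) looks roughly like:

    /* Query per-output physical size via RandR and derive a DPI / rough
     * scale factor. Build with: cc randr_dpi.c -lX11 -lXrandr */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

        Window root = DefaultRootWindow(dpy);
        XRRScreenResources *res = XRRGetScreenResources(dpy, root);

        for (int i = 0; i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            if (out->connection == RR_Connected && out->crtc && out->mm_width > 0) {
                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                double dpi = crtc->width / (out->mm_width / 25.4);
                /* 96 DPI is the conventional 1x baseline. */
                printf("%s: %ux%u px, %lumm wide -> %.0f DPI, ~%.2fx scale\n",
                       out->name, crtc->width, crtc->height,
                       out->mm_width, dpi, dpi / 96.0);
                XRRFreeCrtcInfo(crtc);
            }
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }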
> FWIW this is also the "Xorg Way" (RandR provides all the necessary information), it is just that clients do not bother.
One of the differences is that Wayland surfaces do have a scale property. So the compositor knows when clients do not bother, and can do the scaling for them. Under X11, some clients do bother, some don't, and the compositor doesn't know which are which.
Right now there isn't any protocol for doing what I described, however this information could easily be added. EWMH (or something like it) needs to be extended for scaling support. Applications could do the scaling themselves as they detect their toplevel windows being moved around, but I'm not aware of any application doing that, and it'd be fighting the window manager, so it isn't a good idea anyway.
But the X server already provides all the necessary information and mechanism for implementing this; it is the clients that need to use it: the window managers need to implement some way to inform toplevel windows that they need to change scale (e.g. via a custom message), applications need to inform the window manager and the compositor that they support such scaling messages (e.g. via a window attribute), and compositors need to be able to scale windows that do not support this (which will probably need a special window manager <-> compositor protocol for compositors that are independent from the window manager, while still allowing the window manager to handle scaling without having to also be a compositor).
> But the X server already provides all the necessary information and mechanism for implementing this,
The X server provides that at the display level, not at the screen level. If you do it at the display level, you cannot move windows between displays; the application has to destroy and recreate them (out of all applications, only emacs can do that). If you want to use them per screen, which is how it is done today, then the screens must have identical density.
> the window managers need to implement some way to inform toplevel windows that they need to change scale (e.g. via a custom message), applications need to inform the window manager and the compositor that they support such scaling messages (e.g. via a window attribute) and compositors need to be able to scale windows that do not support this scaling (this will probably need a special window manager <-> compositor protocol for compositors that are independent from the window manager while still allowing the window manager to handle scaling without having to also be a compositor).
Defining the protocol would be the easier part; persuading application authors to support it would be the hard part. They would ignore it for years to come. And that's for applications that are maintained or can be updated.
So when you have to make changes, you might as well change the protocol and adjust it to achieve multiple additional objectives - which is exactly what Wayland did.
What you are describing is not enough; the X server or the X compositor would still need to scale windows up/down if they don't support scaling or if you want fractional scaling. It's not something an old-style window manager can do -- there is just no real "Xorg way" to do this right now.
This is only needed if a program doesn't support scaling itself, and it is something that a compositor can support. In addition there is a branch by Keith Packard for server-side window scaling that could also be used for this and work independently of any window manager (though a compositor can also be made to work with any window manager - for example there are people using a compositor with Window Maker even though WM doesn't support compositing).
I don't know anything about the wayland API. Does this imply that windows can't have a different scaling on each screen they may overlap on? Or can a window be split across different surfaces?
Wayland surfaces (windows) have an integer scale property that says at what scale they are rendered. The compositor then suitably scales them for the target displays. If a surface is on two different displays at two different resolutions, it will be scaled correctly on both (up or down, depending on the application; in practice, downscaled on the lower-resolution one).
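For anyone curious what that looks like on the wire: core Wayland advertises the scale per output as an integer (wl_output.scale), and a client echoes its choice back with wl_surface_set_buffer_scale. A minimal listener that just prints the advertised factors (assuming libwayland-client) could look like:

    /* Print the integer scale factors advertised by each wl_output; a real
     * client would pass its chosen factor to wl_surface_set_buffer_scale().
     * Build with: cc scale.c -lwayland-client */
    #include <stdio.h>
    #include <string.h>
    #include <wayland-client.h>

    static void output_geometry(void *data, struct wl_output *out, int32_t x, int32_t y,
                                int32_t phys_w, int32_t phys_h, int32_t subpixel,
                                const char *make, const char *model, int32_t transform) {}
    static void output_mode(void *data, struct wl_output *out, uint32_t flags,
                            int32_t w, int32_t h, int32_t refresh) {}
    static void output_done(void *data, struct wl_output *out) {}
    static void output_scale(void *data, struct wl_output *out, int32_t factor)
    {
        /* This integer, per output, is the only scale information core Wayland
         * gives a client. */
        printf("output %p reports integer scale %d\n", (void *)out, factor);
    }

    static const struct wl_output_listener output_listener = {
        output_geometry, output_mode, output_done, output_scale,
    };

    static void registry_global(void *data, struct wl_registry *reg, uint32_t name,
                                const char *interface, uint32_t version)
    {
        if (strcmp(interface, "wl_output") == 0 && version >= 2) {
            struct wl_output *out = wl_registry_bind(reg, name, &wl_output_interface, 2);
            wl_output_add_listener(out, &output_listener, NULL);
        }
    }
    static void registry_remove(void *data, struct wl_registry *reg, uint32_t name) {}

    static const struct wl_registry_listener registry_listener = {
        registry_global, registry_remove,
    };

    int main(void)
    {
        struct wl_display *dpy = wl_display_connect(NULL);
        if (!dpy) { fprintf(stderr, "no Wayland display\n"); return 1; }
        struct wl_registry *reg = wl_display_get_registry(dpy);
        wl_registry_add_listener(reg, &registry_listener, NULL);
        wl_display_roundtrip(dpy);  /* learn about globals and bind outputs */
        wl_display_roundtrip(dpy);  /* receive the output events */
        wl_display_disconnect(dpy);
        return 0;
    }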