Hacker News

> just throw output eDP-1 scale 2 in the config and you're set

Why? What? Why? The system knows the exact DPI of every attached display, so why does it still need explicit configuration in 2020 to support HiDPI?!




How would it know what scaling factor you want?


It should start with a reasonable default and then give the user the choice of different scaling factors like macOS does: https://media.discordapp.net/attachments/380570311227342859/...


This is sway, i3 for Wayland, a tiling window manager. It's not some commercial piece of software that has to target the lowest common denominator to survive in the market. It's meant to be configured by you, and it's very configurable. So it doesn't ship any defaults beyond what's needed to launch the window manager; the rest is up to you.


Never mind this: sway already detects HiDPI displays based on a heuristic from EDID info and chooses an appropriate scale factor.

https://github.com/swaywm/sway/issues/1800
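The linked issue describes the idea. As a rough illustration only (this is not sway's actual implementation; the 96-DPI base, the rounding, and the panel dimensions are all assumptions), an EDID-based heuristic might look like:

```python
def guess_scale(width_px: int, width_mm: int) -> int:
    """Guess an integer output scale from the EDID physical width.

    Hypothetical sketch, not sway's real heuristic.
    """
    if width_mm <= 0:
        return 1  # EDID may report no physical size (projectors, etc.)
    dpi = width_px / (width_mm / 25.4)
    # Assume ~96 DPI per scale step; both the base and the rounding
    # are guesses.
    return max(1, round(dpi / 96))

print(guess_scale(3840, 294))  # ~13.3" 4K laptop panel
print(guess_scale(1920, 531))  # ~24" 1080p desktop monitor
```

The point of the thread stands either way: a heuristic like this picks the default, and the user overrides it only if the guess is wrong for their setup.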

8, now 9, comments in a subthread because no one could be bothered to validate assumptions or do a Google search.


Configurability is great, but that isn't an excuse for having poor defaults. A child comment to yours mentions that Sway now uses a heuristic to find a "reasonable" default scaling factor, so it appears that Sway currently does the right thing.


The standard has been ~100 DPI, so a 200 DPI display should by default get 200% scaling with no extra configuration needed.


The "standard" for anything today should be the pixel count divided by the physical display size, giving the exact resolution in DPI for each of X and Y. Scaling should be relative to that physically correct resolution.
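The per-axis arithmetic described here is straightforward; a quick Python sketch (the 27-inch 4K panel dimensions in millimetres are example values):

```python
MM_PER_INCH = 25.4

def dpi_xy(px_w, px_h, width_mm, height_mm):
    """Exact per-axis DPI: pixel count over physical size in inches."""
    return (px_w / (width_mm / MM_PER_INCH),
            px_h / (height_mm / MM_PER_INCH))

# A typical 27" 4K panel (~597 mm x 336 mm active area)
x_dpi, y_dpi = dpi_xy(3840, 2160, 597, 336)
print(round(x_dpi), round(y_dpi))  # ~163 DPI on both axes
```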

We've inherited a lot of baggage and odd conventions, some of which were wrong to begin with. I don't think we should be carrying on with it if we can do better. Having these scaling factors directly correspond to physical reality would be a start.


I'd rather just have the configuration file be simple and well documented and let me make the decision. My monitor running at maximum resolution is about 163 DPI, so an automated system could guess both ways. 200% scaling works well for me, and it's a single line of configuration, done once, and I don't have to worry about heuristics changing behind my back.
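For reference, the single line meant here looks like this in the sway config (the output name `DP-1` is an example; `swaymsg -t get_outputs` lists the names on your system):

```
# in ~/.config/sway/config
output DP-1 scale 2
```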


Maybe my monitor is close by and I have good eyes. Or maybe it’s further away. The system can’t know.


Why do you think it needs any of that? For that matter, why do you think Windows, macOS, Android, ChromeOS, etc. all need "explicit configuration"? They all simply set a reasonable default scale and then give you an easy way to pick a different one if you want.



