
Here you are right that there is a defect in the ancient X Window System: it has only one global DPI value instead of one DPI value per attached monitor.

Correcting this would have been a very small change, much simpler than inventing the various "integer scaling" and "fractional scaling" gimmicks that have been included in some desktop environments.
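
For illustration, the RandR extension already reports a physical size for each output, so a per-monitor DPI can be computed from it even now. Below is a minimal C sketch, assuming Xlib and libXrandr are available (build with "cc dpi.c -lX11 -lXrandr"); it prints the single core-protocol DPI next to a DPI derived per connected output.

    /* Minimal sketch: per-monitor DPI via the RandR extension. */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xrandr.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy)
            return 1;

        int screen = DefaultScreen(dpy);

        /* Core protocol: one physical size, hence one DPI, per screen. */
        double core_dpi = DisplayWidth(dpy, screen) * 25.4
                          / DisplayWidthMM(dpy, screen);
        printf("core X screen DPI: %.1f\n", core_dpi);

        /* RandR: each output reports its own physical size in millimetres,
         * so a DPI value can be computed per attached monitor. */
        XRRScreenResources *res =
            XRRGetScreenResourcesCurrent(dpy, DefaultRootWindow(dpy));
        for (int i = 0; i < res->noutput; i++) {
            XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
            if (out->connection == RR_Connected && out->crtc && out->mm_width) {
                XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
                printf("%s: %.1f DPI (%u px / %lu mm)\n",
                       out->name,
                       crtc->width * 25.4 / out->mm_width,
                       crtc->width, out->mm_width);
                XRRFreeCrtcInfo(crtc);
            }
            XRRFreeOutputInfo(out);
        }
        XRRFreeScreenResources(res);
        XCloseDisplay(dpy);
        return 0;
    }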

Using the correct units in graphics APIs is not "programming font rendering". It would have been better if pixels had never been exposed in any graphics API after the introduction of scalable font rendering and scalable drawing, thus removing any future scaling problems, but it was tempting to provide them to enable optimizations, especially at a time when many were still using very low-resolution VGA displays.
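
As a hedged illustration of what "correct units" could look like, here is a small sketch of a drawing call that accepts millimetres and converts to device pixels only inside the toolkit, using the DPI of the output it actually renders to; the names (draw_rect_mm, fill_rect_px, mm_to_px) are hypothetical, not from any real API.

    #include <stdio.h>

    /* stand-in for the backend's pixel-level primitive; the application
     * never calls it directly */
    static void fill_rect_px(int x, int y, int w, int h)
    {
        printf("fill %dx%d px at (%d,%d)\n", w, h, x, y);
    }

    static int mm_to_px(double mm, double dpi)
    {
        return (int)(mm * dpi / 25.4 + 0.5);   /* 25.4 mm per inch */
    }

    /* The application specifies geometry in millimetres; conversion to
     * device pixels happens only here, with the DPI of the output in use. */
    void draw_rect_mm(double x_mm, double y_mm, double w_mm, double h_mm,
                      double output_dpi)
    {
        fill_rect_px(mm_to_px(x_mm, output_dpi), mm_to_px(y_mm, output_dpi),
                     mm_to_px(w_mm, output_dpi), mm_to_px(h_mm, output_dpi));
    }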

However, such optimizations are typically useless, because they optimize an application only for the display that happens to be used by the developer, not for the display of the end user. Optimizations for the latter can be achieved only by allowing users to modify any size, so that they can choose whatever looks best on their hardware.
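
A minimal sketch of that idea, with illustrative names only: the user picks a text size in points, and the toolkit converts it using the DPI of whichever monitor the window occupies, so the result looks right on the user's hardware rather than the developer's.

    #include <math.h>

    static double user_text_pt = 12.0;   /* chosen by the user, not the developer */

    static int text_px_for_monitor(double monitor_dpi)
    {
        /* e.g. 12 pt -> 16 px at 96 DPI, 32 px at 192 DPI */
        return (int)lround(user_text_pt * monitor_dpi / 72.0);
    }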



Even if there were no way to control the output at the pixel level, you could easily be left with Minecraft-like blocks -- there is not much else your high-DPI monitor can do with a client that simply doesn't output at higher resolutions. E.g. if it uses a bitmap icon, that will still be ugly. (Sure, they should use vector icons, but what about an image viewer showing bitmaps?)
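
To make the blockiness concrete, here is a sketch of the only thing a naive compositor can do with a 1x bitmap on a high-DPI output: replicate each source pixel into an N x N block (nearest-neighbour). The types and names are illustrative, not from any particular toolkit.

    #include <stdint.h>
    #include <stddef.h>

    /* Scale a w x h RGBA bitmap up by an integer factor; dst must hold
     * (w*factor) * (h*factor) pixels. */
    static void upscale_nearest(const uint32_t *src, uint32_t *dst,
                                size_t w, size_t h, size_t factor)
    {
        for (size_t y = 0; y < h * factor; y++)
            for (size_t x = 0; x < w * factor; x++)
                /* every destination pixel copies its nearest source pixel,
                 * so each source pixel becomes a solid factor x factor block */
                dst[y * w * factor + x] = src[(y / factor) * w + (x / factor)];
    }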



