
I'd probably agree with the "useful" part, but I find higher resolution more aesthetically pleasing, especially for text.


Conversely, I find anything above 1920x1080 very displeasing precisely because it removes my ability to practically use bitmapped fonts. Subpixel antialiasing is very distracting and Retina (IMHO) is a solution in search of a problem when it comes to making user interfaces that are actually aesthetic and easy on the eyes. I'm autistic and have diagnosed vision problems tho, so that probably feeds into it for better or worse.


Fair enough and to each their own. I've been using computers since bitmapped fonts on 320x200 screens were the norm, and I've always been excited to upgrade resolution.


I too think text looks nicer at higher DPI, but I had bad experiences with fractional scaling on Linux in the past, and 4K monitors are more expensive, so I didn't bother getting one.


I am always surprised to see these claims about so-called "fractional scaling", which is something I have never encountered on Linux.

This "fractional scaling" might be a problem of Wayland and/or GNOME, but it is certainly not a problem of Linux or of the X Window System.

In any non-stupid graphics environment you just need to set an appropriate value for the dots-per-inch parameter, which informs all applications about the physical size of the pixels on your monitor (allowing the rendering algorithms to scale any graphic element arbitrarily).
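
As a minimal sketch, assuming a 27-inch 4K panel (the size, and the use of xrandr/xrdb here, are illustrative rather than universal), the whole setup amounts to one division and one setting:

    # Sketch: derive the DPI of an assumed 27-inch 4K monitor and hand it to X.
    # Assumes a running X session with the standard xrandr and xrdb tools.
    import math
    import subprocess

    width_px, height_px = 3840, 2160   # native resolution
    diagonal_in = 27.0                 # assumed physical diagonal in inches

    dpi = round(math.hypot(width_px, height_px) / diagonal_in)   # ~163 here

    subprocess.run(["xrandr", "--dpi", str(dpi)], check=True)
    subprocess.run(["xrdb", "-merge"], input=f"Xft.dpi: {dpi}\n",
                   text=True, check=True)
    print(f"DPI set to {dpi}")

Toolkits that respect Xft.dpi then size their fonts accordingly, with no per-application fiddling.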

Any non-stupid application must specify font sizes in typographic points, not in pixels. When this is done, the fonts will be rendered at the same physical size on any monitor, but beautifully on a 4K monitor and uglier on a Full HD monitor.
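
The arithmetic behind this is just points x DPI / 72; a quick sketch with assumed densities for a 24-inch Full HD and a 27-inch 4K panel:

    # Sketch: a font size given in points maps to different pixel counts,
    # but the same physical size, once the true DPI of each monitor is known.
    POINTS_PER_INCH = 72
    MM_PER_INCH = 25.4

    def points_to_pixels(size_pt: float, dpi: float) -> float:
        return size_pt * dpi / POINTS_PER_INCH

    for name, dpi in [("24-inch Full HD (~92 DPI)", 92),
                      ("27-inch 4K (~163 DPI)", 163)]:
        px = points_to_pixels(10, dpi)
        print(f"{name}: 10 pt -> {px:.1f} px ({px / dpi * MM_PER_INCH:.2f} mm)")

Both lines come out at about 3.5 mm; the 4K monitor simply spends roughly three times as many pixels on each glyph.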

The resolution of a Full HD monitor is extremely low compared with printed paper, so the fonts rendered on it are greatly distorted relative to their true outlines. A 4K monitor is much better, but at normal desktop sizes it is still inferior to printed paper, so for big monitors even higher resolutions are needed to recreate the experience that has been available for hundreds of years when reading printed books. A 4K monitor can match retina resolution only for very small screens or for desktop monitors seen from a great distance, much greater than a normal working distance.
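
For concreteness, here is the rough pixel-density gap (screen sizes are assumed; even an ordinary laser printer is around 600 DPI):

    # Sketch: pixel density of some common monitor configurations vs. print.
    import math

    def ppi(w_px, h_px, diagonal_in):
        return math.hypot(w_px, h_px) / diagonal_in

    print(f"24-inch Full HD: {ppi(1920, 1080, 24):5.0f} PPI")
    print(f"27-inch 4K:      {ppi(3840, 2160, 27):5.0f} PPI")
    print(f"27-inch 5K:      {ppi(5120, 2880, 27):5.0f} PPI")
    print( "laser print:       600+ DPI, imagesetters far higher")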

Similarly, any non-stupid drawing application must not specify dimensions in pixels, but in proper length units or in units relative to the dimensions of the screen or of the window; then the sizes will be the same everywhere, but all graphical elements will look better on a 4K monitor.
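
The same one-line conversion works for physical lengths and for window-relative sizes (the helper names here are just illustrative):

    # Sketch: sizes kept in millimetres or as a fraction of the window,
    # converted to device pixels only at draw time.
    MM_PER_INCH = 25.4

    def mm_to_px(mm: float, dpi: float) -> float:
        return mm * dpi / MM_PER_INCH

    def window_fraction_to_px(fraction: float, window_px: int) -> float:
        return fraction * window_px

    for dpi in (92, 163):
        print(f"{dpi} DPI: a 2 mm stroke -> {mm_to_px(2, dpi):.1f} px")
    print(f"25% of a 1600 px window -> {window_fraction_to_px(0.25, 1600):.0f} px")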

This was already elementary knowledge more than 30 years ago, and it has been the recommended practice since the earliest versions of the X Window System and MS Windows. I do not even know when this modern "fractional scaling" junk appeared or who is guilty of it.

About a decade ago I switched to using only 4K monitors with my desktops and laptops, all of which run Linux (with XFCE), and in all this time I have never had any kind of scaling problem, except with several professional (!!) applications written in Java by incompetent programmers, which not only ignore the system settings (so they show pixel-sized windows and fonts), but also offer no option for choosing another font, or at least another font size (so much for Java's "run anywhere" claim).


Now try 2 screens with different pixel densities. Also, it is pretty dumb to call out apps like that: popular frameworks either support that workflow or they don't. I should not be programming font rendering in my todo list app; that is outside the scope of such a project.


Here you are right that there is a defect in the ancient X Window System, because it has only one global DPI value instead of one DPI value per attached monitor.

Correcting this would have been a very small change, much simpler than inventing the various "integer scaling" and "fractional scaling" gimmicks that have been included in some desktop environments.
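
The information needed is already there: xrandr reports each output's pixel size and physical size in millimetres, so a per-monitor DPI is one division away. A rough sketch (the regex is tuned to typical xrandr --query output and may need adjusting):

    # Sketch: compute a separate DPI for each connected monitor from the
    # pixel and physical (mm) sizes that xrandr reports.
    import re
    import subprocess

    out = subprocess.run(["xrandr", "--query"],
                         capture_output=True, text=True).stdout

    # Lines look roughly like:
    #   DP-1 connected primary 3840x2160+0+0 (normal ...) 597mm x 336mm
    pattern = re.compile(
        r"^(\S+) connected.*?(\d+)x(\d+)\+\d+\+\d+.*?(\d+)mm x (\d+)mm", re.M)

    for name, w_px, h_px, w_mm, h_mm in pattern.findall(out):
        dpi = int(w_px) / (int(w_mm) / 25.4)
        print(f"{name}: {w_px}x{h_px} on {w_mm}x{h_mm} mm -> {dpi:.0f} DPI")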

Using the correct units in graphics APIs is not "programming font rendering". It would have been better if pixels had never been exposed in any graphics API after the introduction of scalable font rendering and scalable drawing, thus removing any future scaling problems, but it was tempting to provide them to enable optimizations, especially when many people were still using very low-resolution VGA displays.

However, such optimizations are typically useless, because they optimize an application only for the display that happens to be used by the developer, not for the display of the end user. Optimizations for the latter can be achieved only by allowing users to modify any sizes, so that they can choose those that look best on their hardware.


Even if there were no way to control the output at the pixel level, you could easily be left with Minecraft-like blocks -- there is not much else your high-DPI monitor can do with a client that simply doesn't output at higher resolutions. E.g. if it uses a bitmap icon, that will still be ugly. (Sure, it should use vector icons, but what about an image viewer showing bitmaps?)
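
To illustrate, here is the choice such a client (or its compositor) ends up with when a small bitmap has to be blown up for a 2x output; this is just a sketch using Pillow (>= 9.1 assumed) and a hypothetical 32x32 icon file:

    # Sketch: upscaling a low-resolution bitmap for a high-DPI display.
    # Nearest-neighbour keeps the sharp Minecraft-like blocks; a smooth filter
    # trades them for blur. Neither recovers detail that was never there.
    from PIL import Image

    icon = Image.open("icon_32px.png")   # hypothetical 32x32 bitmap icon
    scale = 2                            # e.g. a 2x high-DPI output
    size = (icon.width * scale, icon.height * scale)

    icon.resize(size, Image.Resampling.NEAREST).save("icon_blocky.png")
    icon.resize(size, Image.Resampling.LANCZOS).save("icon_smooth.png")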


It's too bad that displays designed for 2x UI scaling are so rare outside of Apple stuff. Even on Windows, which is probably the OS with the best fractional UI scaling, 2x looks visibly better.



