
> You lost me right here on line 1.

I'm on Windows and can confirm that 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens, but it doesn't work in all applications (e.g. QGIS). So maybe not all OSes, but two important ones.

> You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.

I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
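A quick back-of-the-envelope check of that claim, using the conventional ~1 arcminute figure for 20/20 acuity (my own arithmetic, so treat it as a sketch):

  import math

  # Pixel pitch of a 27" 16:9 1080p panel vs. the distance one
  # arcminute subtends at 1 m (the usual 20/20 figure).
  diag_in, h_px, v_px = 27, 1920, 1080
  width_in = diag_in * h_px / math.hypot(h_px, v_px)   # ~23.53"
  pitch_mm = width_in * 25.4 / h_px                    # ~0.311 mm
  acuity_mm = 1000 * math.radians(1 / 60)              # ~0.291 mm at 1 m
  print(pitch_mm > acuity_mm)                          # True, just barely

The pitch comes out just slightly above the acuity limit, which might explain why some people can pick out pixels at a metre and others can't.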



> I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!

Are you sure you have 20/20 vision? I can absolutely resolve individual pixels with zero effort whatsoever on 1080p 27-inch displays.

Back when I had a 27-inch 1080p display at work, my MacBook's 13-inch Retina display effectively became my main monitor. The 27-inch monitor was relegated to displaying documentation and secondary content, because I found its low resolution so eye-straining.

Edit: I might have found it so eye-straining because macOS does not support subpixel rendering. That means a lot of people will need a 4K or Retina monitor to have a comfortable viewing experience on the Mac.


macOS does support subpixel rendering, and has since at least the early-to-mid 2000s. One or two versions back, though, they turned it off by default, since it isn't necessary on HiDPI "Retina" displays and they only ship HiDPI displays now.

You can still turn it on, although it requires the command line.
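(For what it's worth, the command that circulated when Apple flipped the default was, if I remember right, something along these lines; the exact key has varied between releases, so treat it as a sketch:

  defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO

plus a restart of the affected apps.)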


Subpixel rendering dramatically slows down text rendering. When you have a high-res screen and want everything to run at 120fps, even text rendering starts to become a bottleneck.

That, combined with the fairly massive software complexity of subpixel rendering, is probably why macOS dropped it.
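A toy sketch of where the extra work comes from (my own illustration, not real rasterizer code):

  # Hypothetical coverage(x) returns how much of the glyph covers
  # horizontal position x. Grayscale AA takes one sample per pixel;
  # subpixel AA takes one per RGB stripe, and its output is a color
  # triple rather than an alpha mask you can cache and blend anywhere.
  def grayscale_aa(coverage, x):
      return coverage(x + 0.5)

  def subpixel_aa(coverage, x):
      return [coverage(x + off) for off in (1/6, 3/6, 5/6)]

Roughly triple the sampling, and the cached glyphs stop being simple alpha masks.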


It's been a while since my eyesight was tested, but I think so! I can see pixels if I focus, but not when reading text at any speed. I have also checked, and my display is only 24" (could've sworn it was more!), so maybe that's why. I retract my comment :)


> I think the point the parent is making is that human vision has limited resolution.

If you can't see the difference between 4K and 1080p on a 24" monitor, then you probably need reading glasses. On a 27" monitor it's even worse. It's not so much that you can "see" the pixels; subpixel rendering and anti-aliasing go a long way toward making the actual blocky pixels go away. The difference is crisp letters versus blurry ones.
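Running the numbers (mine, same ~1 arcminute assumption as the earlier sketch) at a more typical 60 cm desk distance:

  import math

  width_in = 27 * 16 / math.hypot(16, 9)          # ~23.53" wide panel
  acuity_mm = 600 * math.radians(1 / 60)          # ~0.175 mm at 60 cm
  for name, h_px in (("1080p", 1920), ("4K", 3840)):
      pitch_mm = width_in * 25.4 / h_px
      print(name, round(pitch_mm, 3), pitch_mm > acuity_mm)
  # 1080p 0.311 True  -> pixels resolvable
  # 4K    0.156 False -> below the acuity limit

which lines up with 1080p looking blurry and 4K looking crisp at a desk.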


On Windows, blurriness depends on the font: some fonts render with blurry letters, others with crisp ones.


Yes, I can see the difference, but I (personally) don't notice that difference while reading. I do notice a big difference when using older monitors with lower DPI compared with 1080p on a normal-sized desk monitor, however.


> then you probably need reading glasses

Maybe, just maybe, one could talk about how crisp text appears on 4k _without_ being rude.


As a glasses-wearer, this does not seem rude to me.


> I'm on Windows and can confirm that 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens, but it doesn't work in all applications (e.g. QGIS). So maybe not all OSes, but two important ones.

I haven't seen any scaling issues on Windows in years. The last time was Inkscape, but they fixed that.


I see these issues all the time, with enterprise desktop apps. The scaling is only really a problem because it is enabled by default when you plug in certain displays. If the user made a conscious choice (which they would easily remember if they had trouble), it would be fine.


For many, many years, monitors topped out at 120 DPI, with almost all of them at 96, and I imagine a lot of enterprise applications have those two values (maybe 72 as well) hard-coded and don't behave properly with anything else.

I know my company's ones do.
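The classic version of the bug, as a hypothetical sketch rather than anything from a real codebase:

  # Point-to-pixel conversion with 96 DPI baked in, vs. asking the OS.
  def points_to_px_buggy(pt):
      return round(pt * 96 / 72)     # fine until the DPI isn't 96

  def points_to_px(pt, dpi):
      return round(pt * dpi / 72)

  print(points_to_px_buggy(12))      # 16 px -- tiny text at 150% scaling
  print(points_to_px(12, 144))       # 24 px -- what the user expected

Anything that baked in 96 (or 72, or 120) works fine right up until it doesn't.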


I'm currently working from home, accessing my Windows 10 desktop machine in the office via Microsoft's own Remote Desktop over a VPN connection. This works fine on my old 1920x1280 17" laptop, but connecting from my new 4K 15" laptop runs into quite a few edge cases, and plugging in an external non-4K monitor has led to at least two unworkable situations.

I've now reverted to RDP-ing from my old laptop, and using the newer one for video calls, scrum boards, Spotify and other stuff that doesn't require a VPN connection or access to my dev machine. It mostly works OK in that configuration.

I've seen other weird things happen when using other Terminal Services clients, though.



