
> Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.

Low DPI monitors are pretty much unusable since macOS dropped subpixel rendering -- fonts are a blurry mess. For all-day work, you can really only use macOS with high-DPI monitors now. It's a huge problem for everyone I know who wants to plug their MacBook into a normal-DPI display. Not that the subpixel rendering/hinting was ever that good; in my opinion, Linux has always had much better font rendering across a wider range of displays.



> Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering

Nonsense, fonts look fine on non-Retina monitors; they were fine on my old 24" 1920x1200 monitor and are fine on my new 27" 2560x1440 one. Can I see a difference if I drag a window from the external monitor to the built-in Retina display? Yes, but text is not blurry at all on the external monitor.

If it matters, "Use font smoothing when available" is checked in System Preferences (which only appears to have an effect on the Retina display, not the monitor).


That's been my experience, too. I prefer high-DPI monitors, but back when I was going into the office (remember going into the office?) and connecting my MacBook to a 1920x1200 monitor, text was perfectly readable. I suppose if I had two low-DPI Macs, one running Catalina and one running, I don't know, High Sierra, I might be able to tell the difference at smaller font sizes.

As an aside, I wonder whether the article's explanation of how font hinting works -- I confess that for all these years I didn't know the point of "hinting" was to make sure that fonts lined up with a rasterized grid! -- explains why I always found fonts to look a little worse on Windows than on macOS. Not less legible -- arguably hinted fonts are less "fuzzy" than unhinted fonts on lower-resolution screens, which (I presume) is what people who prefer hinted fonts like about them -- but just a little off at smaller sizes. The answer is that they literally are off at smaller sizes: hinting snaps outlines to the pixel grid, distorting shapes and spacing.
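To make the grid-fitting point concrete, here's a rough FreeType sketch (the font path is just a placeholder): loading a glyph with hinting grid-fits the outline to whole pixels, so its advance width differs slightly from the unhinted one at small sizes -- which is exactly that "off" feeling.

    #include <stdio.h>
    #include <ft2build.h>
    #include FT_FREETYPE_H

    int main(void) {
        FT_Library lib;
        FT_Face face;
        /* placeholder font path -- point this at any TrueType font */
        if (FT_Init_FreeType(&lib) ||
            FT_New_Face(lib, "DejaVuSans.ttf", 0, &face))
            return 1;
        FT_Set_Char_Size(face, 0, 11 * 64, 96, 96);   /* 11 pt at 96 DPI */

        FT_Load_Char(face, 'm', FT_LOAD_DEFAULT);     /* hinted: grid-fit */
        long hinted = face->glyph->advance.x;

        FT_Load_Char(face, 'm', FT_LOAD_NO_HINTING);  /* raw outline metrics */
        long unhinted = face->glyph->advance.x;

        /* advance.x is 26.6 fixed point: 64 units per pixel */
        printf("hinted %.3f px, unhinted %.3f px\n",
               hinted / 64.0, unhinted / 64.0);
        return 0;
    }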


These things are fairly subjective, but it's hard to argue that Catalina has good font rendering on regular-DPI screens. I dealt with it when I had to, and it was very poor. There are also tons of bugs around it, like the chroma subsampling issue: Apple doesn't handle EDID correctly, so fonts look even more terrible on some screens. A Google search will confirm these problems.


This is an interesting position. I have always thought that fonts and font rendering were an especially pernicious issue on Linux and a relative joy on macOS.


I think that is a historical artifact. There was a set of patches for FreeType called Infinality, developed starting around 2010, which dramatically improved font rendering and was widely used on distributions like Ubuntu. Since then, most of those improvements have been adopted and refined upstream. [1] Any reasonably modern Linux desktop should have very good font rendering.

[1]: https://www.freetype.org/freetype2/docs/subpixel-hinting.htm...
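If you want to poke at it directly, here's a minimal sketch of what that doc describes (needs FreeType >= 2.7, where the v40 interpreter became the default):

    #include <ft2build.h>
    #include FT_FREETYPE_H
    #include FT_TRUETYPE_DRIVER_H   /* TT_INTERPRETER_VERSION_40 */
    #include FT_LCD_FILTER_H

    int main(void) {
        FT_Library lib;
        if (FT_Init_FreeType(&lib))
            return 1;

        /* Select the v40 "subpixel hinting" TrueType interpreter, the
           stripped-down descendant of the Infinality one (v38). */
        FT_UInt v = TT_INTERPRETER_VERSION_40;
        FT_Property_Set(lib, "truetype", "interpreter-version", &v);

        /* Standard FIR filter for LCD (RGB) subpixel rendering. */
        FT_Library_SetLcdFilter(lib, FT_LCD_FILTER_DEFAULT);

        /* Glyphs loaded with FT_LOAD_RENDER | FT_LOAD_TARGET_LCD now
           come back as FT_PIXEL_MODE_LCD bitmaps (3 samples per pixel). */
        return 0;
    }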


Except for the HiDPI inconsistency across apps and window managers.


As with most things Apple, it is a joy as long as you restrict yourself to only plugging the device into official Apple peripherals, preferably ones that are available to buy right now. It’s when you start hooking your Mac up to old hardware or random commodity hardware that the problems surface.


I recently started using Linux some on the same 4K monitor I usually have my Mac connected to. I was shocked at how much sharper and easier to read the text was on Linux.


It's pretty straightforward to re-enable subpixel rendering.
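For anyone searching: the setting people usually mean is the global CGFontRenderingFontSmoothingDisabled default, commonly set with "defaults write -g". Here's a sketch doing the same through the CoreFoundation preferences API; note that it toggles font *smoothing*, and whether true subpixel (RGB) rendering actually comes back on Catalina is disputed.

    /* Equivalent of:
     *   defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO
     * Restart apps (or log out and back in) for it to take effect. */
    #include <CoreFoundation/CoreFoundation.h>

    int main(void) {
        CFPreferencesSetValue(CFSTR("CGFontRenderingFontSmoothingDisabled"),
                              kCFBooleanFalse,
                              kCFPreferencesAnyApplication,
                              kCFPreferencesCurrentUser,
                              kCFPreferencesAnyHost);
        return CFPreferencesSynchronize(kCFPreferencesAnyApplication,
                                        kCFPreferencesCurrentUser,
                                        kCFPreferencesAnyHost) ? 0 : 1;
    }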


I actually agree that most modern Linux DEs have better rendering, especially since you can configure how much hinting and anti-aliasing you want.
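A sketch of what those knobs map to at the FreeType level (the enum and function names here are illustrative, not any DE's actual config schema -- toolkits pick the flags via fontconfig):

    #include <stdio.h>
    #include <ft2build.h>
    #include FT_FREETYPE_H

    /* Illustrative mapping from a desktop "hinting" setting to the
       load flags a toolkit would pass to FT_Load_Glyph(). */
    typedef enum { HINT_NONE, HINT_SLIGHT, HINT_FULL } HintLevel;

    static FT_Int32 load_flags_for(HintLevel level, int lcd_subpixel) {
        if (level == HINT_NONE)                        /* faithful but softer */
            return FT_LOAD_RENDER | FT_LOAD_NO_HINTING;
        if (lcd_subpixel)                              /* RGB subpixel AA */
            return FT_LOAD_RENDER | FT_LOAD_TARGET_LCD;
        return level == HINT_SLIGHT
            ? FT_LOAD_RENDER | FT_LOAD_TARGET_LIGHT    /* light grid-fit */
            : FT_LOAD_RENDER | FT_LOAD_TARGET_NORMAL;  /* full hinting */
    }

    int main(void) {
        printf("slight+lcd flags: 0x%lx\n",
               (unsigned long)load_flags_for(HINT_SLIGHT, 1));
        return 0;
    }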



