
How is Computer Modern much worse than, say, Times New Roman?


Computer Modern has very high contrast between thick and thin strokes, and the stress axis is completely vertical. This makes it very difficult to render on bitmap displays: you can end up with 3-pixel stems next to 1-pixel hairlines, and antialiasing can't compensate optically because most of the strokes run exactly along the pixel grid, either vertical or horizontal.
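
To make the pixel arithmetic concrete, here is a minimal sketch (in Python) of how stroke widths snap to whole pixels at different screen resolutions. The thick/thin widths are made-up round numbers chosen to illustrate a high-contrast face, not the actual Computer Modern parameters.

    # Illustrative only: hypothetical stem/hairline widths in points for a
    # high-contrast "modern" face; NOT the real Computer Modern values.
    THICK_PT = 1.0   # main stem
    THIN_PT = 0.3    # hairline

    def stroke_pixels(width_pt, font_pt, dpi, design_pt=10.0):
        """Stroke width in device pixels after scaling the design size to
        font_pt and rounding, with a 1-pixel floor so it doesn't vanish."""
        px = width_pt * (font_pt / design_pt) * dpi / 72.0  # 72 points per inch
        return max(1, round(px))

    for dpi in (96, 200, 600):
        thick = stroke_pixels(THICK_PT, 12, dpi)
        thin = stroke_pixels(THIN_PT, 12, dpi)
        print(f"{dpi:3d} dpi: stem={thick}px hairline={thin}px "
              f"ratio={thick / thin:.1f}")

With these made-up numbers the nominal 3.3:1 contrast collapses to 2:1 at 96 dpi, at 200 dpi you get exactly the 3-pixel-next-to-1-pixel pattern described above, and only at print-like resolutions does the intended ratio survive the rounding.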

These characteristics of the font are intentional: it's really a showcase for Knuth's Metafont system, which exposes stroke contrast and dozens of other parameters, so you can generate many weight and style variations of Computer Modern parametrically. And it does look bearable in print. But it's just about the worst possible screen font.

Times is much better in this respect: it was designed for readability long before computers, for newspaper printing, where unexpected ink bleed and other artifacts were a daily reality.

(Cynically, it's possible to view Computer Modern as a fitting portrait of computer science itself: the font is used because it's exciting to have 62 tweakable parameters in a system designed by someone famous in the field, not because it's any good for the end users.)


Okay, so you're not saying it's always hideous; you're saying it's hideous on screen. I think you're wrong even there: some of the screens that have come out in the last few years have resolutions approaching those of cheap inkjet printers, and on those screens Computer Modern looks okay. If you don't have access to such a screen, there is a simple trick you can use instead: just set your zoom to something like 10x. The font looks glorious then, in my opinion. The one tiny problem is that nothing fits on the screen :)


I suspect his problem is more with Windows and its brain-dead font rendering. Vertical strokes are the ideal case for subpixel anti-aliasing on a typical LCD: you get 3x the horizontal resolution. On a typical 100dpi screen that gives you 300dpi horizontal resolution, which is around the low end of print quality. On a modern 200dpi "retina" display you get 600dpi horizontally, which is more than enough to make Computer Modern look good. But ClearType has a strong preference for aligning positions and sizes to whole pixels, giving you horrendous kerning and sudden large jumps in glyph size and apparent weight as you increase the requested font size.
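
The resolution claim is just multiplication; here is a tiny sketch, assuming a standard horizontal RGB-stripe panel where each pixel is three horizontally adjacent subpixels (color fringing limits how much of that extra resolution is actually usable):

    # Effective horizontal resolution with subpixel rendering, assuming a
    # horizontal RGB-stripe LCD: 3 addressable subpixels per pixel.
    SUBPIXELS_PER_PIXEL = 3

    for name, panel_dpi in [("typical desktop LCD", 100),
                            ('200dpi "retina" display', 200)]:
        horizontal_dpi = panel_dpi * SUBPIXELS_PER_PIXEL
        print(f"{name}: {panel_dpi} dpi panel -> "
              f"~{horizontal_dpi} dpi horizontal stem placement")

That puts the common 100dpi desktop panel right at the low end of print quality, as described above, and the 200dpi panel comfortably past it.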


I'm using a Retina MacBook. The readability problem with Computer Modern is intrinsic to its design, and even a 200 dpi screen just isn't enough to make it look good (IMHO). But it's definitely much worse still on Windows.

My question is: why use this particular 1980s font, developed to showcase a dead-end stroke-based font technology, when there are much better alternatives around?


Why use it? It's the default, and otherwise you have to find a math font and a text font that go together, all on your own. At least that's why I use it: laziness.

I agree that 200dpi is not quite enough for Computer Modern. But I claim that, say, 7200dpi is far more than enough, and then it looks great.


Times New Roman often means that you created the document with an old version of Microsoft Word (before Calibri became the default) and never bothered to customize anything.



