
a) I don't see how a piece of software which has essentially not changed for twenty years is a "good example of stability". Stability in the face of what? Real software occupies a constantly changing world, and TeX is sheltered from that by the surrounding third-party tools. TeX's stability would only be impressive if it were changing or growing while still retaining its original functionality. Anybody can refuse to update a program and declare it to be stable - there's no lesson for me as an engineer there, beyond just refusing to write code.

b) I don't understand this claim that TeX is bug free - there have been 947 documented bugs in TeX, with 427 of them occurring after TeX82. There was at least one bug in TeX found every year until 1995, and over a dozen have been found since then - in a code base which hasn't changed. The number of bugs in TeX as a percentage of the number of lines doesn't seem to be unusually low, and given that very little development has been done in the past 26 years, there has been very little chance of introducing new bugs. Once again, I'm not sure what I'm supposed to be learning from TeX as an engineer other than refusing to write code.

c) I'm not talking about Clojure, I'm talking about TeX. I agree that there are few examples of fast, high-quality typesetting software. But TeX is still slow - it does so much file I/O because it assumes a tiny amount of RAM. A modern rewrite could be 10 times faster, I reckon.

d) By definition it's a language that almost nobody is familiar with - and that's a problem. After some careful study, I am now familiar with it, but it's still very hard to read a program which is structured to be read as a book. One cannot skim-read the code or get a good understanding of its fundamental structure as a program. Instead one is always trapped in Knuth's narrative. I don't want to read a book - I want to read a program, and WEB makes that practically impossible. The end result is neither a program nor a book, and is unwieldy to programmer and reader alike - indeed, literate programming never caught on.

The software I really admire is software which has reacted to change, modernised, improved. Even Microsoft Windows is a great example of stability in the long run. TeX isn't, because it hasn't done any of those things. It still compiles - and it's very good at typesetting - but as an exercise in writing code, it offers few relevant lessons to the modern programmer.




Stability in the sense that, if you write something against TeX and find bugs, the bugs are yours, not TeX's. There are plenty of programs which "have not changed" but retain bugs.

I did not say that the life of TeX was bug free. Just that its current status is essentially bug free. Now, I do grant that a large class of bugs, namely regressions, are rendered impossible by the development style. But I personally think there may be lessons in that.

For your speed claim, you need more than "it doesn't work how I think it should." Numbers, or you are just pipe dreaming. The Clojure rewrite is relevant here, because it is a recent attempt to modernize TeX. It is slower, by the author's admission. Any examples that aren't slow?

I mean, yes, I understand your point about I/O being somewhat of a red flag for speedups. I'm curious why this low-hanging fruit has never actually been picked, then.

For d), I just have to disagree. As someone who doesn't even know Pascal, I found it approachable. Are there parts that are tough? Sure, it is a full product. Try reading parts of git's source. (Granted, the parts that get into heavy mathematical reasoning are particularly tough, but I consider that my failing.)


What you're saying is that you don't mean "stable", you mean "bug-free". You listed them as two separate points originally - there is a difference between them, and conflating "stable" with "bug-free" isn't useful.

You did literally claim that TeX was "bug free". It's entirely unremarkable that, with no new features added to TeX in 26 years, eventually most of the bugs have been fixed - and there's been at least one bug found almost every year since 1977. If anything, the lesson here is that TeX almost certainly still contains bugs.

TeX's slowness can be quantified by measuring how much time it spends making I/O syscalls. You can pass debugging arguments to TeX so that it will log more information about which phase of processing it is in - TeX spends very little time typesetting and most of its time reading and writing auxiliary files. Because a document with cross-references must be re-typeset three times, and most of TeX's running time doesn't consist of typesetting, we can be fairly sure that TeX is almost three times slower than it needs to be in this case.
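
To make that concrete, here's the sort of crude measurement I mean - a sketch in Python, assuming `latex` is on your PATH and with `doc.tex` standing in for a real document with cross-references. It doesn't separate I/O time from typesetting time (you'd want a profiler or strace for that), but it does show what the repeated passes cost:

    # Crude measurement sketch, not a rigorous benchmark: time each
    # successive pass of a TeX run to see what the repeated passes
    # forced by cross-references actually cost. Assumes `latex` is on
    # PATH; `doc.tex` is a placeholder for a document with \label/\ref.
    import subprocess
    import time

    PASSES = 3  # typical number of runs needed for references to settle

    for i in range(1, PASSES + 1):
        start = time.perf_counter()
        subprocess.run(
            ["latex", "-interaction=batchmode", "doc.tex"],
            check=True,
            stdout=subprocess.DEVNULL,
        )
        print(f"pass {i}: {time.perf_counter() - start:.2f}s")

Each pass costs roughly the same, even though the later passes exist only to settle the cross-references - which is exactly the overhead I'm describing.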

The I/O situation is difficult if not impossible to resolve without completely rebuilding and redesigning the TeX program from the ground up - it's a macro-processing system, and it has many thousands of lines of macros which need to be processed for even a simple document. When TeX was designed it was impossible to do most of this in memory, so the entire design of TeX is built around processing streams of data with as little state as possible. TeX processes one page at a time, and all its global state is based on that assumption - certainly not low-hanging fruit.

d) Pascal is a great little language, once popular for teaching students. But TeX is written in WEB, which adds a fairly complex macro layer which Knuth then uses in a very non-linear manner (with respect to the code - the narrative is linear). But it's not just the language which makes TeX tricky to read - it's how it was used: the entire program is one giant global state, manipulated by a handful of giant functions which perform most of their logic using GOTOs. That was normal for the 1970s, but it's unquestionably very difficult to wrap one's head around. At least with git I can skim through the source code and understand its structure as a piece of code - even if the meaning is still difficult to comprehend.


Stable in that it is stable. Not only bug free, but not constantly shifting under your feet. I don't know of a better word than stable for this. And I do feel that "bug free" is a necessary condition for it. Just as I would feel that a foundation for a house is only stable if it is not moving and free of defects.

I claimed that TeX is literally bug free today. If you feel this is not the case, find a bug. I did not claim that TeX originated bug free. At least, I did not intend to make that claim. Apologies if it read that way.

We cannot be certain that TeX is three times slower than it needs to be in that case, given that nobody has produced a system that is three times faster. Seriously, I am a fan of incremental compiles and multipass processing of documents. I am also more and more a fan of empirical results. Claims that things should be better, without any hard evidence, bother me.

And I think we just have to agree to disagree regarding WEB. I have found TeX far more understandable because of WEB than many other pieces of software. Specifically, the linear narrative is far more of an aid to understanding than any of the newer abstractions that are promoted elsewhere. Sure, the code is rather nonlinear. Outside of almost academically simple programs, I have seen few pieces of software where this is not the case.



