Hacker News
LaTeX Cookbook – Collection of LaTeX Recipes (latex-cookbook.net)
239 points by gjvc on Dec 24, 2021 | 103 comments



LaTeX needs to be updated. It's slow, its error handling needs to be better, packages need to be improved, the language should be more expressive, etc.

There is beamer for presentations, which is a pain to work with. Sometimes the most I can glean from an error message is the slide in which the error occurred. Sometimes I even have to incrementally delete text and compile what remains to locate the offending slide. Sometimes I can't even abort the program with Ctrl+C, Ctrl+D, or Ctrl+X after the error has occurred, and I have to close the terminal (maybe IDEs are better, but that's another can of worms). It's also super slow. I am quite often in real pain working with it.

A lot of the time, producing the intended behavior in LaTeX requires complex hacks of the internals (see the Stack Exchange answers) because the feature is not natively supported.

It seems software engineers don't care about LaTeX (and don't use it enough to know its problems). Academics, in turn, are not software experts who could fix it. The pace of development has thus been slow.


> LaTeX needs to be updated.

If you would like to support the ongoing work happening with TeX, LaTeX, and friends, please consider joining a user group. You can join the TeX Users Group at https://tug.org/join.html, or there is a link to instead make a donation at the same page. TUG does a lot; see https://tug.org/activities.html, including supporting the LaTeX3 project and the development of LuaTeX over the years, as well as work on accessibility.

If you prefer, there are many other user groups centered on location and language. See https://tug.org/usergroups.html.


Excellent! This is a constructive suggestion, instead of the rants about LaTeX that amount really to nothing, because the whole community around LaTeX will not stop using it just because some outsider doesn't like the way it works.


Maybe it's you? I don't really have any of those problems. LaTeX works great for me (I use TeXShop on a pretty old Mac mini). You seem to be using it in the terminal. Why?! TeXShop (I hear there are a few other, equally good GUIs) has been nothing like a "can of worms" but a delight to use; I can't think of any problems I've had with it in 5+ years. Not having Command-click to instantly jump between a place in the LaTeX file and the corresponding place in the PDF and back would be a pain, I imagine. Or not being able to program your own macros. It's slow? Compared to what? It typesets 100 pages in what seems like pretty good time to me. You know you can pre-compile all sections of a long text except the one you're working on, right?

Why expect every one of the million features users might want to be "natively supported"?! What you describe as "a complex hack of internals" is usually, in practice, a 30-second visit to Stack Exchange and a cut-and-paste. Although I haven't needed to do that in a long time. I was amazed when I started (it did take about three months of intense learning at the beginning) that every package I needed or wanted was already on my computer, and usually the author was on the TeX Stack Exchange answering questions!


LaTeX has a number of issues:

- Compilation feels almost non-deterministic at times. You should not have to recompile your document two or three times to get the final result, yet that's pretty standard practice. References/bibliographies don't work the first time through, and often you compile the bibliography separately, which is just crazy to me.

- Compilation errors are really unhelpful. In programming, if I omit one side of a bracket or parenthesis somewhere, I'll almost certainly get an error with a reference to the line number right at or below the issue. GOOD LUCK finding a missing bracket in a LaTeX document from the error messages when compilation fails.

- Lots of pretty simple behavior that really should be standardized and built in requires additional packages. The real problem comes when combining multiple packages... It's like how medications interact: it's almost hopelessly intractable to try to predict how multiple packages will interact with each other, so the way you do a simple thing in one document may change in another depending on the combination of packages.

- >What you describe as "a complex hack of internals" is usually in practice a 30 second visit to Stack Exchange and a cut and paste.

I'm with the person you were replying to on this. There's a lot of really hacky stuff needed to get a result that should have been built in. A built-in quote or excerpt, for example, should just be a standard style like \begin{quote} Quote \end{quote}. But instead you have to do it as a \parbox[center]{hope-you-guessed-a-good-width-mm}{Quote}.

- It's a small thing, but the way you do quotes is pretty silly. I'm not a fan of `` '' instead of "". If that's really how I have to do it without including additional packages, why not at least have built-ins like \lquote and \rquote?
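For what it's worth, the csquotes package provides roughly this abstraction: one context-aware command instead of manual backtick/apostrophe pairs (a minimal sketch):

```latex
\usepackage[english]{csquotes}
% ...
\enquote{a quoted phrase}            % renders the right quote marks
\enquote{outer \enquote{inner} text} % nesting is handled automatically
```

Changing the language option changes the quote style document-wide, which is exactly the kind of built-in behavior asked for above.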

I am normally a person that is deeply opposed to language updates, unnecessary feature updates, new frameworks, etc. But in the case of LaTeX, it's one of the most obvious examples I can think of where it's desperately needed.


"You should not have to recompile your document 2 or 3 times... References / bibliographies don't work the first time through..." If you think about it, you'll probably see why. Cross-references can't be set until you know what they refer to, which may be later on in the doc ("see section 8 below"), and setting such references (or citations) can change the page numbering. If the multiple passes (which are analogous to the multiple passes a compiler makes) bother you, then use a tool that hides them from your eyes, like arara. That's probably what Word does under the hood.
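For what it's worth, arara makes those passes declarative: you list the steps as directives in comments at the top of the document and run `arara doc.tex` once. A sketch, assuming a standard BibTeX workflow (the bibliography file name is a placeholder):

```latex
% arara: pdflatex
% arara: bibtex
% arara: pdflatex
% arara: pdflatex
\documentclass{article}
\begin{document}
As shown in \cite{knuth84}, see also Section~\ref{sec:later}.
\section{Later}\label{sec:later}
\bibliographystyle{plain}
\bibliography{refs}
\end{document}
```

The repeated pdflatex runs are what resolve citations and cross-references; arara just hides the repetition.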


I think what OP is suggesting is that the multiple passes should be done by the executable itself. They shouldn't have to be abstracted away by some other tool. TeX already does two or so passes for layout IIRC.


This, totally. Nobody cares—or should be forced to care—whether a given compiler does something in a single or multiple passes, what counts is ergonomics and overall speed. FWIW when I still worked with TeX (XeLaTeX) I soon built my own wrapper for the executable that simply compared the sha1sum of the relevant *.aux file from before with that from after the run and would loop until the checksum wouldn't change any more.
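That checksum loop is simple enough to sketch in a few lines (Python; the compiler invocation and file names are placeholders, not anything from the original wrapper):

```python
import hashlib
from pathlib import Path

def aux_digest(aux: Path) -> str:
    """SHA-1 of the .aux file, or "" if it does not exist yet."""
    return hashlib.sha1(aux.read_bytes()).hexdigest() if aux.exists() else ""

def compile_until_stable(run_pass, aux: Path, max_passes: int = 5) -> int:
    """Call run_pass() until the .aux checksum stops changing.

    run_pass is any zero-argument callable that performs one compiler
    run, e.g. lambda: subprocess.run(["xelatex", "doc.tex"], check=True).
    Returns the number of passes performed.
    """
    before = aux_digest(aux)
    for n in range(1, max_passes + 1):
        run_pass()
        after = aux_digest(aux)
        if after == before:  # fixed point: cross-references have settled
            return n
        before = after
    return max_passes
```

This is essentially what latexmk and arara do internally, except they also watch .toc, .bbl, and friends rather than just the .aux file.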


Seems like rubber [1] is a commonly-used wrapper for this sort of thing. Works well in a makefile.

[1] https://gitlab.com/latex-rubber/rubber/


> Compilation errors are really non-helpful. In programming, if I omit one side of a bracket or parentheses somewhere, I'll almost certainly get an error with a reference to the line number right at or below the issue. GOOD LUCK finding a missing bracket in a LaTeX document based on the error messages when compilation fails.

You are right, but I think it is not trivial to solve because, unlike programming languages, LaTeX is interleaved with natural language, so almost anything is possible. Maybe the best approximation is to use heuristics. For example, start with a valid document, introduce errors, and train a neural network to identify the problem. Overkill?


While we're already using machine learning and neural networks to solve this problem, why not use a blockchain? /s

TeX being "interleaved with natural language" is not at all at the heart of the problem IMO. Every HTML document does so, too. The real problem lies much deeper. TeX uses recursive rewrites of the code that is the document to achieve what it does, without properly keeping track of where it is and what it does. There is no technical necessity for doing so, but the technique introduces huge complexity costs. On a somewhat related note, TeX's syntax at its very heart is also not trivial to parse short of executing (compiling) a given document (i.e. TeX's syntax itself is Turing-complete). This has been done only because the author got carried away by the coolness factor of it. Already a few years after TeX came out, Leslie Lamport implemented LaTeX on top of TeX to make TeX more approachable to less dedicated / insanely gifted users; he chose to dial back on this very aspect—the inscrutable syntax—and proposed a much more regular syntax as a convention.


TeX being Turing complete is what allowed LaTeX in the first place. HTML not being Turing complete requires JavaScript and CSS separately. Do you find them easier? Maybe, I'm not sure, but you cannot escape your language being Turing complete because the range of possible documents is so vast.


This doesn't sound right. First of all, I'm not complaining about TeX being Turing-complete; what I'm saying is that the syntax is unnecessarily hairy because TeX's syntax itself is Turing-complete, meaning there is no static grammar that would let you recognize all the macros / commands / 'active' content parts in a given document; you have to typeset (compile) it, and only then can you know. Nothing is gained by this, in theory or in practice; it's just a burden.

To say that "HTML is not Turing-complete, therefore CSS as a separate language is required" is not even wrong and I leave it to the readers to fill out the blanks.

To say that "HTML is not Turing-complete, therefore JavaScript as a separate language is required" is more or less correct, although, to be fair, many features that once had to be implemented in imperative JS, such as animation, have since been absorbed by declarative CSS, so in that regard the need for JS has diminished.

I can say with certainty that a full-fledged programming language (i.e. Turing-completeness + usability) is absolutely necessary for a viable solution to document production. And while one can (and many people do) produce TeX and HTML output programmatically, this does not obviate the need for a typesetting engine that can tell you where you are and what the circumstances are while typesetting the document. JavaScript has grown many abilities in this direction, but TeX in comparison is, after almost 45 years in development, still lacking and CLUMSY AS HELL for crying out loud.

Nothing is easy, simple or straightforward in TeX. Numbers are difficult, units are difficult, command syntax is difficult, namespaces are difficult (because there are none), variable names are difficult, conditions are difficult (there are like hundreds of specialized \if s), knowing in which column you are is difficult, fonts are difficult, character encoding is difficult, keeping constant line heights is difficult, positioning floats is difficult, the processing model is difficult, the very syntax is difficult.
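To illustrate the "numbers are difficult" point: even simple arithmetic in plain TeX goes through mutable count registers and imperative commands rather than expressions (a minimal sketch):

```latex
\newcount\total
\total=5
\advance\total by 3
\multiply\total by 2
The answer is \the\total.  % typesets 16
```

There is no (5 + 3) * 2; every operation is a separate statement mutating a register, and division truncates silently.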

I've said it before and I will say it again: TeX is a computational typesetting system, but it turns out it's quite bad at dealing with numbers, strings, and conditions, the basic building blocks of computational typesetting.

TeX is an utterly broken system because every single aspect is a convoluted mess.

It does get a lot of praise for its incorporation of the Knuth-Plass line breaking algorithm, and rightly so. However, this beast, too, is hard to handle unless you just go with the defaults. It will happily give you overshooting lines because Knuth decided that having a word jut out of the right margin was a better solution than unsightly spaced-out words or letters. Most printers would disagree and just bite that bullet where necessary.

Yes, TeX will issue a warning in such cases, but this is almost insubstantial because TeX warns you about many, many things in the same output stream as its informative messages. That stream is almost impossible to parse for humans and machines alike. To make it even harder, TeX's console output is hardwired to wrap at column 80. Even TeX's error messages are a convoluted mess.

/rant

And to come back to your original question: yes, I strongly suspect that the separation of styling (CSS), content (HTML), and programming proper (JS) into three distinct compartments with differing syntaxes is close to the local optimum that we, as people of this time and age, can and want to deal with. If anything, the syntaxes of JS and CSS could perhaps be unified (CSS is 90% JavaScript's object literal syntax; it adds selectors, allows hyphens in property names, and has quite a few notational amenities on the right-hand side; other than that, it's like object notation). One can also imagine expressing CSS in HTML, which would be a bit clumsy but still somewhat usable. However, unifying markup syntax with imperative syntax is almost sure to fail. JSP (remember those?) and embedded PHP are two examples of how we do not want things to be. JSX looks a bit more viable. Svelte does this very interesting thing where a single file defines the markup, the styling, and the behavior in three distinct parts; to me that looks like the future.


I don’t want to learn one specialized IDE for every application. I use Vim and Emacs for coding in general. There are of course plugins, but they didn’t notably change the experience for me.

The speed problem is mentioned in the beamer paragraph. It's true to some extent for LaTeX too, but it becomes problematic with beamer (lots of figures with TikZ and pgfplots, etc.). And I have to use beamer, because the source documents are in LaTeX.

When you consider collaborative writing, Overleaf is slow even with base latex.

If you use it heavily, you will see its issues. Part of the problem is that it's not consistent. In Python, when I type print("LaTeX") the answer is always the same. In LaTeX, if I type \hspace*{5mm}, the result depends on many things. This becomes especially problematic when the effect you are trying to produce is not covered by a common template; the language is not fully rule-based either, so you can't systematically produce the effect yourself. You have to try various things.


>You seem to be using it in the terminal. Why?!

Not OP, but I have similar problems to OP. One of the reasons I use(d) it in the terminal is that I use make, which also generates figures for me. I also don't like having to learn X different GUIs for Y different programming languages and prefer using vim and make where possible.


This is my reply also to aborsy: I also don't use a different IDE for every programming language, just VSC or Sublime Text for them all. I guess I think of (La)TeX as a tool for writing books/papers, not just another programming language. TeXShop (and I guess the other GUIs) is not like vim or emacs, with a million key commands to learn. Most of the non-standard key commands I use are ones I defined myself, e.g. Command-I to make the selected word(s) italic. Anyway, refusing to use (La)TeX the easy way and then complaining about how hard it is to use seems hardly (La)TeX's fault. We have this incredible publishing technology... and you refuse to use it?!

CorrectHorseBat: "I use make which also generates figures for me" - I'm curious how that works! Sounds very interesting.


>Anyway, refusing to use (La)TeX the easy way, then complaining about how hard it is to use, seems hardly (La)TeX's fault.

Not generating clear error messages, or apparently requiring a GUI as the easy way, is definitely LaTeX's (and the surrounding tooling's) fault in my eyes. LaTeX files are text; just give me a decent compiler and let me use my own editor. If that is too hard, there's something wrong.

Also, to me all the syntax is unnecessarily cryptic, working with "variables" is a pain, nice modularisation is non-existent, and there are probably a few more things I forgot.

>I'm curious how that works! Sounds very interesting.

Nothing really fancy. I have Python scripts that generate the figures included in my PDF. I use make to stitch everything together.
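A minimal sketch of that kind of setup (all file and script names here are hypothetical, and it assumes each script takes its output path as an argument):

```makefile
FIGURES = fig_results.pdf fig_timing.pdf

paper.pdf: paper.tex $(FIGURES)
	latexmk -pdf paper.tex

# Each figure is rebuilt only when its generating script changes.
fig_%.pdf: make_fig_%.py
	python $< $@
```

The nice property is incrementality: touching one figure script regenerates only that figure before the PDF is rebuilt.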


It’s mostly because LaTeX is deemed mature for its audience and features like these, while flashy, are not going to see much use day to day.

Most academics want to take plots they make in Matlab or whatever and put them into the template supplied by a journal and get an elegant looking document with well typeset equations and a bibTeX bibliography.


TeX support by conferences and journals is declining though (in engineering).


I hope this comment was made in jest or sarcasm. No respectable conference has replaced LaTeX with Word. Word is just a nightmare for typesetting scientific documents, with their requirements for precise/fixed table and figure alignment, gutters, margin delineation, etc. Although a Word template is provided, almost no one uses it, for the aforementioned reason.

Feel free to check this claim against any top-N engineering conference proceedings, any year of your choice. It is easy to verify by looking at the PDF metadata for the engine that made the document.


Maybe this was just a glitch in the matrix: https://web.archive.org/web/20210419133815/https://irmmw-thz...

"We do not provide LaTeX templates as this generates technical issues with the IEEE process. If you nevertheless decide to use LaTeX,...", seems like a slight discouragement.


*This* is your pick for a top-N conference in engineering? Seriously, a random conference with a motley of papers from all kinds of EE topics, self-hosted in China. If you are making a joke argument for argument's sake, merry Christmas & please humor yourself elsewhere.

IEEE does not check or process their sources because they pay only a token amount to keep the IEEE banner. (IEEE recognition for local events and the associated commission/fees is a scam, but that's not what we are here for. These institutions pay a base charge of ~$3000 USD to get their papers indexed in IEEE Xplore. It also gives naming rights to use IEEE in the conference name. For several other conferences, IEEE also takes 25% of attendees' registration fees.)

Source: I am personally on the programme committee of two recognized international conferences: the International Conference on Image Processing (ICIP) and the International Conference on Pattern Recognition (ICPR). (You can look up the validity of a conference on Google Scholar Metrics.)


Thanks for the input. How does one find the validity of a conference with Google Scholar Metrics?


Google Scholar Publications [1]. This indexes venues by impact factor, citations, etc., basically a proxy for how established the venue is. You can further filter by application area in Categories & Subcategories. For example, [2] happens to be the list for Computer Graphics only (among all the other CS areas).

[1] https://scholar.google.com/citations?view_op=top_venues

[2] https://scholar.google.com/citations?view_op=top_venues&hl=e...


I haven’t noticed this at all.


Really? Which venues dropped support?


What else are they moving to then?


Word.


I've heard this before but I have not really seen any examples of this. It seems like virtually every paper I read has been rendered with LaTeX.

I would totally be fine with LaTeX being replaced, but I hope to god that I'm not forced to use MS Word all the time.


Have you tried the WYSIWYG TeXmacs? (It's not based on TeX nor emacs but is inspired by both.)

See: https://www.youtube.com/watch?v=H46ON2FB30U


At that point just use LyX https://www.lyx.org/


TeXmacs is vastly superior to LyX, but, more importantly, it is a completely different thing, with a completely different concept. The following is my opinion, which I cannot justify in a detailed and well-founded way, based on a few years of TeXmacs use and more of LaTeX/LyX: given the difference between the two programs (TeXmacs is not based upon TeX!), LyX does not have the potential to reach the quality of TeXmacs.


There is also ConTeXt: https://en.m.wikipedia.org/wiki/ConTeXt

It always gives me the impression of being better structured, and it doesn't have a module architecture (so no conflicts). Unfortunately, when I tried to install it on Windows a couple of years ago, the installation was extremely painful, and the documentation I found was lacking.


I use ConTeXt. It is a perpetual moving target. If you want a document to keep compiling to the same result, you'd better pin a version :-/

They keep adding things, experimenting with things, sometimes removing things, changing engines. If they think of a feature or someone asks for one, they implement it right away as a fix. They are very nice and helpful, but the problem is that those small features and fixes are not sufficiently thought out to maintain consistency. They are just there to get things done in the moment.

The documentation is mostly at introduction level, whether it is labeled that way or not. It is almost never exhaustive, and it is often not up to date. Often, the options are not described; you just have their names: no idea what they do, let alone how they interact.

Because things interact there too, and as with LaTeX, they don't always interact well, despite the more integrated approach that ConTeXt proposes.

It shares (more and more?) some of the problems of LaTeX. LaTeX has many different packages for the same thing (say, tables) and none of them is complete? Same thing in ConTeXt nowadays: there are several table environments, and none of them is a pure improvement on the others. A package/environment adds a feature compared to an existing one, but it doesn't include all the features of the existing ones. So you often end up with the same category of problem:

-- I want to do A => use this package/environment!

-- I want to do B => use that package/environment!

-- I want to do A and B => you're out of luck!

Yes, I am a bit, no, a lot pissed at the moment:

a. by my constant or growing struggles to get some things done in ConTeXt and LaTeX, things which don't look like they're from outer space and seem like they should be handled out of the box, without needing to get dirty with low-level macros just to get consistent behavior (because often said stuff actually works in some cases);

b. by the lack of robustness these tools still show, despite decades of massive time and energy investment by people who have a much better understanding of these shenanigans than I do. This is emphasised by my recent attempt to come back to LaTeX and the subsequent finding that I am hitting the same kind of exhausting inconsistencies I was hitting 10 years ago.

I am so depressed about the situation that I think it must be impossible to build something robust upon TeX as a programming language. TeX the composition algorithm is probably fine, but building something on this fragile, clunky macro language is probably destined to end up being fragile and clunky, in spite of the original intent. The automation of composition is already a difficult subject with a great many moving, interacting pieces; having to deal with such a rough language on top of this...


Thanks a billion! Probably you saved my Christmas holidays.


I spent several years as a LaTeX user during graduate school. I spent some time in LyX as well.

Then pandoc was released, and Markdown let me be more expressive with content instead of dithering over the minutiae of layout. In that vein, I feel LaTeX is awesome but requires a comprehensive understanding of all of it to be fully expressive. I just want to type my papers quickly, ya know? So Markdown gives me what I need, and if a coauthor needs LaTeX, pandoc can suitably provide it.

I keep my resume in TeX, for what it's worth. I suppose I keep my half-written data science book in it too :)


The modern way to use this is through Overleaf. It is just a breeze. I had to create a new Beamer presentation from old material, and in a few minutes I was able to select a good looking theme, import my material, and add pictures to the slides. No need to bother setting up a LaTeX environment or dealing with error messages directly.


Like amichail said in another comment, you could give TeXmacs (www.texmacs.org) a try. It is a WYSIWYG document preparation system which is at the same time structured and can be completely programmed by the user. The name comes from the fact that it aims at the typographical quality of TeX (the author claims that it _surpasses_ it) and the flexibility of emacs. I have been using it for a few years, and I think the reasons it is not more widespread are that (a) people do not immediately see its potential, and (b) LaTeX unfortunately cannot be exported reliably because of the Turing theorem (really! I am taking this on the word of the developers, but they are established mathematicians, so I assume they know), and so it traps users.

Try TeXmacs a little, and you could like it!


I can recommend Tectonic (https://tectonic-typesetting.github.io/en-US/) as an alternative to directly invoking the LaTeX (or in this case XeTeX) compiler.

The build experience is orders of magnitude better than LaTeX or latexmk.


>Sometimes I can’t even abort the program with Ctrl+C or D after the error has occurred, and I have to close the terminal

In my experience, whenever Ctrl+C doesn't work, LaTeX can be terminated by pressing the X key and then Enter. (Look for the "Type X to quit or <RETURN> to proceed" in the error message.)


The problem with updating TeX is that everyone has a different idea as to how TeX should be updated: is it eTeX, pdfTeX, XeTeX, LuaTeX?


To quit when the error handler is asking for input, enter x and RETURN.


Anyone who doesn't have a long list of problems with LaTeX doesn't use it enough, or doesn't go beyond the built-in templates. Unfortunately, those templates are designed for print and not screen viewing ... so you really have to edit them.

I'd be happy to argue this in detail, but I see that others have already hit many of the big points. (Some more: the typesetting is suboptimal, even if you improve it with the microtype package; the default symbols for real and imaginary part (\Re and \Im) are horrible; the cleveref package should be built in; the geometry package should be built in; the artificial distinction between the equation and align environments exists because the engine can't space properly around aligns, so this should be fixed and they should be a single command; \left and \right for delimiters often make really bad choices; the default font is the only good option if you need matching mathematical characters, and it grates on you after years of use; I could go on for hours...)
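For what it's worth, a preamble that pulls in the fixes mentioned above looks roughly like this (a sketch; the package options are a matter of taste, and the \Re/\Im redefinitions are one common workaround):

```latex
\documentclass{article}
\usepackage{microtype}                % better spacing and protrusion
\usepackage[margin=2.5cm]{geometry}   % sane control of page layout
\usepackage{amsmath}                  % align and friends
\usepackage[capitalize]{cleveref}     % \cref{...} instead of Section~\ref{...}
\renewcommand{\Re}{\operatorname{Re}} % replace the fraktur real-part symbol
\renewcommand{\Im}{\operatorname{Im}} % same for imaginary part
```

The fact that nearly every serious document starts with a stanza like this is itself evidence for the "should be built in" complaint.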

The more interesting question to me is: What should be done about this? LaTeX seems too embedded in the scientific community for there to be a clean break to a new technology (see, e.g. arxiv.org). Also, I'm not sure an entirely new markup language would be desirable, since the core LaTeX language is actually not so bad – it's the transformation of the markup to the final document where the problems occur. What we need is a new backend or shim layer that keeps the basic language but allows for more aesthetically pleasing rendering and better control of the layout.


I don't really see the point of using LaTeX for on-screen viewing, well, not unless you want to impose some amount of focus on the reader (which printed forms usually do). HTML or markdown for on-screen content usually works better.

The main asset of Lamport's LaTeX is its high-quality formatting for printed artifacts. Most of that in turn comes from Knuth's TeX, which is a much lower-level language. One can do (and many definitely have done) markup-to-TeX languages, but they never expose as much control as LaTeX does, so they aren't very popular. Madoko (https://www.madoko.net/) does pretty well, however, so maybe try that? Otherwise, there are plenty of commercial and expensive typesetting systems out there, but besides the cost, I don't think they are even as user-friendly or flexible as LaTeX.


I agree that ideally we should also provide an option to render into HTML/markdown. But for practical reasons it's necessary to have a digital artifact that matches the printed artifact in page and line numbering.

I disagree that "The main asset of Lamport's LaTeX is its high quality formatting for printed artifacts." The main asset for the scientific community is that it lets them (relatively) easily write research papers that are dense with figures and mathematics. (And these are typically viewed online, via arXiv.)


> But for practical reasons it's necessary to have a digital artifact that matches the printed artifact in page and line numbering.

Unless you are distilling to PDF, this is not viable. But I assume you mean PDF output, then you need a screen form factor that is paper-like, like an iPad or some other letter/A4 style tablet.

You can manipulate math for on-screen documents as well (MathML), but (a) it isn't very viable to use them for both formats and (b) people who pursue math publications mostly love the focus of paper formats (even if they view them on a screen as above).


"But I assume you mean PDF output, then you need a screen form factor that is paper-like, like an iPad or some other letter/A4 style tablet."

Why do I need this? Concretely, I go to the arXiv. I look at the paper a bit on my laptop. Then I print it and read it away from my laptop. The printed version has the same pagination and layout as the screen version.


> Then I print it and read it away from my laptop. The printed version has the same pagination and layout as the screen version.

You can totally do that, and that is the most common case. However, if you had a letter-sized device perfect for viewing PDF papers, it wouldn't be "weird" anymore, the pagination and layout would just feel like a good fit.


> But I assume you mean PDF output, then you need a screen form factor that is paper-like, like an iPad or some other letter/A4 style tablet.

iPad is closer to A5 than letter/A4, but most things that are intended for letter/A4 are probably tolerable on it for most users.

iPad Pro is almost exactly US letter size.


Here's how a real man cooks with TeX.

* Carefully sets out header with a tasteful collection of packages.

* Sets all tables thoughtfully using booktabs, never dreaming of using vertical borders.

* Uses WYSIWYM everywhere, \emph for emphasis, not \it, etc.

* Gets an incomprehensible error because he copied a single Unicode character.

* Throws TeX document out of the window, opens text editor, types out markdown.
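(For reference, the booktabs style from the second bullet, with a hypothetical table:)

```latex
\usepackage{booktabs}
% ...
\begin{tabular}{lrr}
  \toprule
  Item   & Qty & Price \\
  \midrule
  Widget &   2 &  3.50 \\
  Gadget &   1 & 10.00 \\
  \bottomrule
\end{tabular}
```

Only horizontal rules of varying weight, no vertical ones; that restraint is most of what makes booktabs tables look good.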


Millions of people, including me, disagree. LaTeX has allowed me to write articles, books, and other documents for decades, and I don't have to worry about whether it will still be there in 10 years or more. It is the work of a genius (Knuth), and it continues to be widely used.


It’s used by millions of academics with Stockholm Syndrome.


Have you tried to write any non-trivial document with MS Word? It is infuriating. Even the simplest tasks don't work reliably, like placing a picture with a caption. And oh my god, if you try to Google your problem, you end up on outdated MS doc pages, or what the pages describe only works in the desktop app and not the online version.


And since you brought up Word, I'll add that it has never in 30 years done numbering right, whether sections (like legal numbering: 1, 1.1, 1.1.1 etc.) or lists. In fact I think it's worse than it used to be; I was tearing my hair out recently trying to get it to number lists, which is such a simple thing.


> Have you tried to write any non trivial document with MS Word?

The last bullet point in dash2's comment specifically says "markdown". Why would you bring up MS Word?


There aren't many academic articles written in Markdown. Most major conferences only accept Word or PDF files, so it's a reasonable assumption to make.


It's used by millions of academics because it is literally the best option, warts and all.

Markdown is vastly under-powered for a lot of things people need to do. Word is a hot mess for the same. Etc. etc.


ReST has TeX's level of power, and its internal data structure maps mostly 1:1 onto DocBook if the native output capabilities of docutils aren't good enough. It's a shame Markdown took over the mindshare for ASCII markup.


One needs a different perspective. It is used by millions of academics who did not yet try TeXmacs.


I use LaTeX. I tried TeXmacs out of curiosity. It’s not good enough, by far.


Just out of curiosity, how long ago did you try it? On the one hand, the latest version, 2.1.1, runs very smoothly and has a complete set of features (the extensibility of course allows you to augment them, just like in TeX); on the other hand, TeXmacs has very different principles from any other document preparation system and needs just a bit of time (far less than LaTeX) to get used to. This blog post by Massimiliano Gubinelli may be a good introduction: https://texmacs.github.io/notes/docs/art-of-math-writing.htm...


LaTeX is not made by Knuth, it's from Lamport.


LaTeX is just a set of packages that run on Knuth's creation.


Or uses TeXmacs



Well, fuck. Where was this when I was in grad school?


tenuously...


Crisp. That is the word that pops in my head whenever I see something that is generated by LaTeX. I am not sure why or how to define it.


It's true. There's an accumulation of small details involving spacing that LaTeX gets right but something like Word gets wrong, each of them individually almost imperceptible, but adding up to an effect where the LaTeX document looks somehow clean and correct, and the Word document looks like a big mess. I can detect a Word document almost immediately on sight but it takes me a while to follow up and find the concrete evidence of Wordness.


Somehow it’s “characteristically ugly” for me, looking at these examples specifically. You can get good-looking documents in LaTeX if you customize formatting just a little bit, but the default colors, shapes and even fonts are just off. Especially Beamer.

It’s really like a style of clothing: a sort of old man’s suit that looks normal in academic institutions but in most other settings you do look like you’ve just come from that institution.

The typesetting is much better than anything in common (non-design-professional) use though.


I agree with what you said.

On the flip side of the coin, I would say don't attempt to customize it. LaTeX out of the box is essentially what you need for an easy to read document.

There is no LaTeX graphic designer out there. I would concede a little defeat and say the poster example on that website looked bad. If customizability is what people are looking for, LaTeX is not the ideal choice: the learning curve for advanced use can be incredibly steep, you may not have a good eye for graphics if you are using LaTeX in the first place, and you might compromise early and call it good enough.


100% agree. Some people have a fetish for wide aspect ratio fonts, such as some of the Computer Modern family. Must have been the fashion at some point.


you sure it’s crisp? Not another word starting with cr and ending with p?

Seriously, TeX is a bad maths language [http://xahlee.info/cmaci/notation/TeX_pestilence.html], surrounded by a terrible document markup language, wrapped in tooling straight from the 1980s, plus a community only too willing to tell you how your question is wrong.


> a community only too willing to tell you how your question is wrong

Been there, experienced that. Like this one poor guy who dared to ask at the TeX StackExchange help desk—"what can I do to get this and that vertical line in my table"—only to get lectured by one acolyte: "Oh noes, don't, vertical lines bad, only horizontal lines in tables, proven to be the only way to go." Which totally throws centuries of finely typeset mathematical, astronomical and nautical tables out of the window while not answering the question.

To be fair, there are quite a few people on TeX.SE who go to incredible lengths to give extensive answers—expositions really—to hairy questions ('hairy' being a technical term here which translates to 'daily' in the context of TeX).


haha, I don't have much experience with mathematical equation writing in LaTeX, I just like how the words and document look. So I can't say much about this.

Have you looked into Groff? https://tex.stackexchange.com/questions/527864/latex-vs-grof...

I have heard at least one person say that the math syntax is much more elegant in groff than in LaTeX.


In the 90s, I used to imagine the future to be one where "browsers" would be much less than combined JavaScript execution / DOM+CSS layout engines and hence able to do much more by downloading (signed) decoders/renderers on-demand for different content types, of which DVI would be just one of many.

There are still huge conceptual leaps to be made in the notion of a computer as a document creator and content rendering, and Smalltalk provides some clues in that direction, though I doubt we'll see it in my lifetime, considering the weight of the last twenty-five years will make it difficult to change course.

The relevance of the above flight of fantasy to your point is that LaTeX, for all its awkwardness, is absolutely a local maximum in document creation. I love seeing its results appear from the markup, knowing that it's following the numbers precisely, as if by magic, and printing them out to marvel at the quality of the fonts close-up. We need more such systems with which to create and render content before denting the DOM+CSS hegemony in the manner I described above.


The whole issue with LaTeX is that it renders to PDFs, which aren't appropriate these days for most electronic documents, because you want reflowable content rather than fixed layout. And (M)HTML is great for that and can include other content types:

https://eater.net/quaternions


Please show me a reflowable document with decent typography. No, seriously, why don't browsers implement fully-featured layout algorithms? Until then, I'd prefer a well-typeset PDF over an average website for longer-form, graphics-rich content.

Also, those are naturally usable offline, are durable and have no interactive annoyances.


Hmm, do you have any more specific examples of typography being a deal breaker?

MHTML (aka .eml) is usable offline too (except, for some reason, in Firefox?!), while PDFs are horrible on screens narrower than the width of a page.

I'm not sure what interactive annoyances on scientific websites you're talking about. The most problematic ones these days are those that assume a working Java or Flash plugin (and if they start to be plagued by irrelevant scripts, it's much easier to block JavaScript in a browser than in a PDF reader). Meanwhile, many PDF readers don't even have any animation support, which is about content rather than presentation! (As a metaphor: would you support a format unable to show pictures?)


Your point is spot-on. Reflowable text is incompatible with most scientific writing and with anything more complex than simple prose.


I didn’t say this. I believe it’s current implementations that are not good enough. Some ebook reading software comes close.


I'm afraid you rather missed the point in the last paragraph -- "we need more such systems with which to create and render content, before denting the DOM+CSS hegemony". Fixed format is much preferable for textbooks or other publications which have many diagrams as part of the body text.


Textbooks, sure. But other publications with many diagrams are exactly my point, no? Fixed format often makes understanding worse, because the diagram can't just be put in the specific spot that refers to it, and finding that spot is additional work. Also, PDFs only really support fixed, non-interactive diagrams.


As far as I know, the main issue there is that LaTeX is a Turing-complete language, so no reliable, complete export filter can exist; in particular, one cannot reliably export all of LaTeX to HTML.


I don't think Turing-completeness implies any issues when converting programs between equally powerful automata. In fact, a lot of results in the theory of automata / early computational complexity take the form of converting all programs of machine A -- in fact the machine itself -- in order to run them on machine B.


Please take a look at the following comments thread https://news.ycombinator.com/item?id=27820466, in particular at https://news.ycombinator.com/item?id=27822662 where Massimiliano Gubinelli (one of the main TeXmacs developers) explains the issue.


In theory there should be no difference between theory and practice, but in practice there often is. TeX's syntax itself being Turing-complete means you can barely start to do static analysis on the code before hitting a wall you can only get over by installing a huge fraction of the ~1GB download that is TeXLive. "Being theoretically possible" is in no way orthogonal to "impractical in many or most circumstances".


> hence able to do much more by downloading (signed) decoders/renderers on-demand for different content types, of which DVI would be just one of many.

I added DVI rendering to NCSA Mosaic in 1994. Didn't go down well with anyone. Probably a good thing overall, but I still think it was a nice idea (multiple different renderers).


> Crisp.

Please open any of the publications in https://www.texmacs.org/joris/main/publs.html (pdf versions) and then get your own opinion on whether anything else is crisp ;-)


> https://www.texmacs.org/joris/surhypexp/surhypexp.pdf

I saw "France" at the top of the document and I quickly scrolled the document thinking, "Looks crisp French to me."

Then it dawned on me.....


The problem that I have with LaTeX is that it predates most of modern software development.

Everything is in a giant namespace. Package management is a mess and packages often conflict.

No numeric characters in command names. If you want to use numbers, you can only hack around it with names like \WidthXV.

No functions in the modern sense. Everything is just macro expansion.

Turing complete but hard to write any useful code. I have seen more code written in Brainfuck than LaTeX.
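To make the digit restriction above concrete (the macro names here are illustrative): multi-letter control words may contain only letters, so one workaround spells digits out, and another builds the name dynamically with \csname:

```latex
% \Width15 cannot be defined directly: digits are not allowed
% in multi-letter control words, hence names like \WidthXV.
\newcommand{\WidthXV}{15cm}

% \csname ... \endcsname can construct a name containing digits:
\expandafter\newcommand\csname Width15\endcsname{15cm}

% ...and the same trick is needed every time it is used:
The width is \csname Width15\endcsname.
```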


Every. Damn. Single. Thing. You. Said.


I would love to see a combination of Markdown + Pandoc used to generate LaTeX whilst keeping the writing process content-centric.

I find the mathematical typesetting functionality of LaTeX to be incredible, but anything else (tables, columns, etc.) to be an absolute nightmare.

Pandoc allows Lua filters to pre-process an AST generated from raw Markdown, and can then convert to 'traditional' LaTeX - why can't we utilise the best of both worlds?

For example, I used Pandoc to create Beamer slides in Markdown, with nice `div` syntax and maths typesetting - the advantage was I could focus on the semantics (and have a directly readable document) without any LaTeX fluff that was purely aesthetic.
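As a sketch of that workflow (the filename, div class, and filter name below are illustrative, not from the original comment), a beamer deck in Markdown can look like:

```markdown
---
title: Example Talk
author: A. Speaker
---

# A first frame

- Each heading at the slide level becomes a beamer frame
- Math passes straight through to LaTeX: $e^{i\pi} + 1 = 0$

::: {.highlight}
A fenced div that a small Lua filter could map onto a beamer block,
keeping the source readable while the filter handles the aesthetics.
:::
```

Compiled with something like `pandoc slides.md -t beamer --lua-filter=blocks.lua -o slides.pdf`, where `blocks.lua` is a hypothetical filter that rewrites the div into the desired beamer environment.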


I have a lot of fond memories of working with LaTeX in the 90s in my undergrad. These days I wouldn't use LaTeX because I want something quick and dirty. I am not investing a lot of time writing documents that people are not going to spend a lot of time reading.

That said, The Lamport book I used heavily in my undergrad still has a place on my bookshelf.


Arguably if you spend a lot of time writing a document it becomes more likely that others are going to read it.

I am a peer-reviewer and an editor on a few academic journals. In my field, a mix of authors use [Xe](La)TeX and Word. There's a definite quality bias towards the TeX articles -- not because of the typesetting per se but rather because it's indicative of a lot of time spent carefully preparing a document to a standard. I'm sure others in similar fields have similar experiences.


I love my '86 Lamport. The cartoons add such a nice touch.


See, that might be cultural background or whatnot, I don't know, but some people find those cartoons to be a nice touch, and they admire the clarity of exposition of The TeXbook.

I'm the proud owner of the entire six-volume Computers and Typesetting series (of which the former title is vol. A), and have worked through sizable portions of those. I also read most of the 680 pages of Knuth's Digital Typography monograph. I also owned one LaTeX book at some point. I admire the depth of Knuth's knowledge (typographic and otherwise), the sheer diligence, his quest for quality, all of that. Those lion doodles though, the author's urge to land another witty remark, come up with another finely crafted reference, and how he's prone to hint at stuff instead of writing proper documentation are not aspects of his work that I enjoy.


I wonder what you think about https://www.monsterwriter.app/ ... I'm exchanging a free licence for valuable feedback :)


It's not clear from the website what this is. Does it convert text files to pdf? Please show the source text file on the landing page. Can you describe the figures easily inside the text file? It's so light on details that I don't even know if I'm interested or not.

LaTeX, for simple articles and light math, is trivial to use and problem-free. The complexities arise when you try to do fancier stuff. How much fancy stuff do you allow in MonsterWriter? Does it support multiple language conventions in the same article? Can I generate MonsterWriter articles programmatically in an easy way? Can I prepare a PDF presentation with animations like in beamer? A poster in an arbitrarily large paper size?


Blogging publication to Ghost: it should include other platforms; Ghost is expensive. Is there really table support? Ulysses is a markdown-based editor which doesn't support tables, so this would be an improvement over that program. PDF export: A4, letter, A5, or what? PDF is size-agnostic. I haven't tried LaTeX export; it would be interesting to write once and have both HTML and LaTeX export. No CSS? This might be helpful.

Just my thoughts.


Is this book available in a dead tree version?


Yes, it's available on Amazon, Book Depository etc.


This is why I use Microsoft PowerPoint for my schematics.



