I've always wondered why there's no Emscripten-like tool, or WASM-like runtime, that shoves your code into a bytecode interpreter embedded within a portable POSIX shell script.
With such a thing, nobody would ever have to write actual lowest-common-denominator /bin/sh scripts again. Autotools could just be a Rust program or whatever. (It'd run rather slowly, of course, but you wouldn't use it for the whole process — it'd just be a bootstrapping agent, used to configure the build of a native-compiled build tool.)
But the old systems, those with the "30 years of embedded knowledge of traps etc.", aren't necessarily POSIX-compliant. And there are different versions and parts of the POSIX standard, and so on.
I'd be surprised if anybody, anywhere, actually understands an entire (2000+ line) configure.ac script. I suspect that they're mostly just copied from other examples, modified a bit, and passed on for the next copyist to use.
I think you vastly overestimate the complexity of configure.ac scripts...
There are some obscure M4 things you can do, like changing the quoting system, but really it's just a macro/templating system for creating a POSIX sh script. Generally it boils down to running POSIX sh commands, with some templating.
Here[0][1] are some autoconf scripts I've written, by hand...
Now, none of these are individually long, but that's just because I tend to split out common functionality and use pre-written macros from the autoconf archive where possible. That shouldn't diminish the point, though: an autoconf script is really just a knowable set of M4 tools (the GNU autoconf manual is good) for creating a POSIX sh script.
> I personally don’t think that the Autotools are ... even all that much more difficult to work with than some of the alternatives
So even one of the main contributors to Autotools acknowledges that it is more difficult to work with than most alternatives, and much more difficult to work with than some of them. And it is.
I mean, I'm not sure how far back the history goes, but why choose an arcane language like M4 to base autotools on? You had Perl in the late 1980s, or even Bash since 1989.
Now, let's discuss the listed strengths of autotools:
1. feature-based approach scales better than lists of system quirks. <- this is not specific to autotools.
2. 30+ years’ worth of embedded knowledge about portability traps for C programs and shell-based build scripting on Unix <- this is true, but by now this is significant mostly for legacy systems. Alternatives have, say, 10 years' worth of such gained embedded knowledge
3. Supports cross-compilation better than competing systems. <- I wouldn't say that. Needs to be argued explicitly.
4. Support software written in multiple languages better than some competing build systems <- Undermined by the focus on C and shell scripting a couple of points ago, and by the fact that the author was only willing to make that point relatively, not absolutely, and to cherry-pick the reference competitors.
5. Autoconf is very extensible, and there are lots of third-party "macros" available. <- This is true for most build systems. Remember, though, that the macros need to be written in an arcane language.
6. Produced tarball releases have fewer build dependencies than those produced by competitors. <- Was not aware of this. Let's take his word for it.
7. Produced Tarball releases have a predictable, standardized (literally; it’s a key aspect of the "GNU Coding Standards") interface for setting build-time options, building them, testing them, and installing them. <- I'll grant him this point too.
8. Tries very hard to generate Makefiles that will work with any Make implementation, not just GNU make, and not even just (GNU or BSD) make. <- This is a rephrasing of a weakness: Only supports Make-based building. "Lots of makes" is a poor substitute for non-Unix-Make building.
9. Excellent reference-level documentation <- I'll again grant him this, but TBH I haven't used autotools documentation so I don't know. I do know that CMake documentation is sometimes lacking or not clear enough.
10. As they are GNU projects, users can have confidence that Autotools are and will always remain Free Software. <- This is actually an interesting point. Having software that can't devolve into a proprietary variant, as can happen with permissive licenses, is a good thing in general. But - *is this really a danger for build-system generators?* I don't have an answer to that. I wish the alternatives were GPL'ed as well.
11. Relatedly, users can trust that architectural decisions are not driven by the needs of particular large corporations. <- And this is not true for CMake or Meson?
12a. Large installed base <- That's almost meaningless. It's not hardware.
12b. Switching to a competing build system is a lot of work. <- No, it isn't. For a new project, it is almost zero work. For an existing project, switching to CMake is _some_ work, but there's some payoff, in my limited experience.
A few more points:
> Autoconf’s core function is to solve a problem that software developers, working primarily in C, had in the 1990s/early 2000s (during the Unix wars).
The Unix wars were mostly in the 1980s and 1990s. They were dying down in the 2000s as Linux ascended. I suppose that Solaris, HP-UX, AIX and OpenServer have their niches, but those are small (correct me if I'm wrong).
> Developers of new code, today, looking at existing configure scripts and documentation, cannot easily determine which of the portability traps Autoconf knows about are still relevant to them.
This is not actually the case. When we're starting a new software project, most of us are developing it for platforms which are 10-years-old or less (I don't mean the hardware which might be older - I mean the OS and libraries).
> Similarly, maintainers of older programs
This is where autotools remains very relevant, in my opinion. Unless its alternatives decide to invest in being more "universal" and supporting older systems/platforms.
> 6. Produced tarball releases have fewer build dependencies than those produced by competitors. <- Was not aware of this. Let's take his word for it.
I believe the intent is to say "you need autotools to build the tarball, but the tarball doesn't need autotools, just regular POSIX tools", which is a jab at systems like cmake where you need cmake to do the configuration at tarball time.
Except, well, the introduction of DVCSes like git has shifted a lot of software away from "build from tarball" to "build from VCS checkout," so the value is rather diminished. Indeed, you can turn it around and claim that this is an antifeature: producing a tarball release is more work than just pointing at a VCS checkout.
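The distinction can be sketched in shell. The directory names here are made up, and the commands in the messages are just the usual Autotools idioms:

```shell
# Simulate the two source layouts: a release tarball ships a pre-generated
# configure script, while a raw VCS checkout does not.
tmp=$(mktemp -d)
mkdir "$tmp/tarball" "$tmp/checkout"
: > "$tmp/tarball/configure"   # stand-in for the generated script

for d in tarball checkout; do
    if [ -f "$tmp/$d/configure" ]; then
        echo "$d: POSIX sh + make suffice (./configure && make)"
    else
        echo "$d: Autotools needed first (autoreconf -i && ./configure && make)"
    fi
done
```

So the tarball consumer needs only POSIX tools, while the checkout consumer has to have Autotools installed to generate configure first.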
Developers build stuff from VCS checkouts. But the package managers and ports systems that regular users rely upon, still use tarball releases, not VCS checkouts, as their upstreams.
For example: Homebrew. If you tell Homebrew to --build-from-source, it'll pull down a release tarball from its release URL; unpack it; set up the shell environment just so; and then run the very same ./configure script the upstream generated with Autotools.
It'd actually be a problematic bootstrapping situation if that ./configure script did require Autotools to run, as there's no real legal way for (proprietary) macOS to ship (GNU, GPLv3) Autotools.
And that's much the same problem many other proprietary UNIXes were in, back when: wanting to support "standard" tooling, but being too wary of this new "copyleft" thing to be willing to ship any GPLed software as part of their OS. Thus, the broadest industry support came out for the tooling that doesn't force the upstream's tooling's licensing choices on the downstream. (Ironically, that tooling happened to be from GNU, the precise people whose software they were trying to avoid shipping.)
> "build from VCS checkout," so the value is rather lesser.
But why does building from VCS checkout make any difference? If anything, there are _more_ requirements that way, since a tarball release can possibly be made to rely on less (e.g. generating ad-hoc install scripts).
M4 is in POSIX because it has a specification and can be implemented independently. Perl can't be specified in a standards document; the one attempt was a failure.
Certain software becomes finished over time, especially POSIX-conformant software. It seems like today's young kids get very agitated if a project hasn't seen multiple commits per hour, but those of us who are a little older and wiser appreciate when software becomes finished.
autoconf-archive still sees tons of improvements. You can still fix or add ax probes as you want or need.
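For instance, using one of those archive macros is just a matter of making it visible to aclocal. The following configure.ac fragment is a made-up minimal example; it assumes ax_check_compile_flag.m4 from the autoconf-archive is on aclocal's search path (or has been copied into m4/):

```m4
# configure.ac sketch -- project name and version are placeholders.
AC_INIT([demo], [1.0])
AC_CONFIG_MACRO_DIRS([m4])
AC_PROG_CC
# AX_CHECK_COMPILE_FLAG comes from the autoconf-archive, not core autoconf.
AX_CHECK_COMPILE_FLAG([-Wall],
  [CFLAGS="$CFLAGS -Wall"],
  [AC_MSG_NOTICE([compiler does not accept -Wall])])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```

The archive macro expands into a plain sh probe (compile a test program with the flag, check the result), which is exactly the "M4 templating over sh" model again.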
autoconf itself is stable enough; no need to change anything. If there are problems, they are elsewhere, e.g. in makeinfo. Or in the fact that there is no proper way to create/maintain man files from GNU docs: POD, markdown or writing them manually seem to be the preferred ways, doxygen or pandoc rather not.
gnulib is a huge icebreaker driving through the pack ice at a very fast tempo. But it's still not good enough to provide Unicode support for coreutils and friends (sed, grep, find). Let's see.
m4, sh, make and friends are the shared part that every POSIX system has, without requiring any external extra tool, be it ninja/meson, etc.
Besides hand-coding a configure script that generates a Makefile yourself, Autotools is the most portable build system out there.
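For a sense of what "hand-coding a configure script" means, here's a sketch in portable sh. Everything in it (the probe list, the PREFIX variable, the hello target) is made up for illustration; real Autoconf output handles vastly more cases:

```shell
#!/bin/sh
# Probe the environment the way a generated configure does, then substitute
# the discovered values into a Makefile (what AC_OUTPUT does at larger scale).
for candidate in "${CC:-}" cc gcc clang; do
    if [ -n "$candidate" ] && command -v "$candidate" >/dev/null 2>&1; then
        CC=$candidate
        break
    fi
done
CC=${CC:-cc}                  # last-resort default, as configure would use
PREFIX=${PREFIX:-/usr/local}

cat > Makefile <<EOF
CC = $CC
PREFIX = $PREFIX

hello: hello.c
	\$(CC) -o hello hello.c

install: hello
	cp hello \$(PREFIX)/bin/hello
EOF
echo "configured: CC=$CC PREFIX=$PREFIX"
```

Nothing here needs more than POSIX sh, a compiler, and make, which is the portability claim in a nutshell.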