
> Because the core value proposition of make is the dependency graph, which then trivially allows for parallelisation with -j<n>

My core argument is that this rarely matters, and when it does, the project's needs often surpass what make provides anyway. Or the project is using a very special version of make with very special supporting infrastructure, like bsd.mk.

I'm compiling some 100k lines of legacy C (for a custom operating system) plus compat shims that make the thing run on Linux, and my very trivial build script (plain POSIX-compatible shell) takes less than 5 seconds in debug mode. Optimized release builds are a bit slower, but that hardly matters in day-to-day development. It would be trivial to add some parallelism with xargs.

Fwiw, I can compile the same thing with CMake, and it's only sometimes faster. It often happens that a header changes and everything needs to be rebuilt anyway. And it often happens that the parallelism bites back: one file fails with an error that scrolls out of sight while a dozen other files are still being compiled.

I've wasted far more time fighting subtly broken makefiles (ones that miss some dependencies and thus fail to recompile the things that need recompiling), and build systems with fancy declarative languages that offer no obvious way to do what takes one line of shell when you don't need to worry about a dependency graph.



Personally, I see CMake as the best way to work with portable C/C++ these days: it supports Linux/Mac/Windows, including library detection and everything else a traditional autotools "./configure" script does.

That said, I'm not a fan of the CMake scripting language, which is quirky and error-prone. It should be easy to write a new 'Find' module, but it's not. There should be officially documented design patterns to make the task trivial. There still aren't.

CMake gets the job done, but it's surprisingly difficult to work with.

</rant>


Not everything else. One thing CMake does not do well, and which bugs me quite a lot, is supporting source code generators, especially in cross compilation, where those generators are themselves written in C or some other language requiring compilation. With Makefiles, whether autotools-generated or manually written, this is easy to set up with a few custom rules. With CMake, those generators need to be turned into standalone projects.


I really want to like CMake, but having spent the past 5 years using qmake daily, CMake seems very voodoo-ish and archaic.


Too bad; the Qt project itself is moving to CMake as its main build tool: https://bugreports.qt.io/browse/QTBUG-73351


I'm aware.


> I'm compiling some 100k lines of legacy C (for a custom operating system) plus compat shims that make the thing run on Linux and my very very trivial build script (plain posix compatible shell) takes less than 5 seconds to run in debug mode.

That's why /you/ don't need make. But there are plenty of 100k-line projects that would take much longer to build, like nearly anything written in C++, for example.


I knew someone was going to say C++.

Thing is, I hardly ever see C++ projects today use plain make. If make is part of the build procedure at all, its files are generated by another layer of tooling.

Again: I hardly ever bump into big projects that use plain make. All the projects I see either use something more complicated or are small enough that they'd be fine with a plain shell script.

Urgently needing the dependency graph that make provides, but little else, seems like a very niche set of requirements.


I like using make in my own small (<1k LoC) personal projects.

Usually, I have just a few .PHONY targets that each run a few lines of shell to init/clean/build.

I get all the same advantages as shell scripts, but with less clutter in my repo and tab autocompletion for free.
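That pattern looks roughly like the sketch below (target names and commands are invented; the Makefile is written out via printf only so the recipe tabs survive and the sketch is runnable). The `.PHONY` declaration tells make these targets don't correspond to files, so they always run.

```shell
#!/bin/sh
# A ".PHONY-only" Makefile used as a task runner: no dependency graph,
# just named entry points that shells provide tab completion for.
set -eu
tmp=$(mktemp -d)
printf '.PHONY: build clean\n\nbuild:\n\t@echo compiling...\n\nclean:\n\t@rm -rf build\n' > "$tmp/Makefile"
result=$(make -s -f "$tmp/Makefile" build)
echo "$result"
rm -rf "$tmp"
```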

Make is a great tool for my use case. It's possible to waste tons of time fiddling with it. I've done it. It's not a necessary consequence of using Make, though; in my case, it was a consequence of using features of Make that I didn't need.


> Make is a great tool for my use case.

I'm sure it is, but I would suggest also looking into possible alternatives like "redo" and "tup". (If nothing else, then for your own edumacation.)

Anyway... this does not change the fact that make is fundamentally unsuited to handle the non-trivial build problems of the modern age. It just does not work well, because it describes dependencies purely in terms of files. Almost no modern programming language works exclusively at the file level -- C and C++ were the major holdouts here, and C++ is moving towards a module-based compilation model... for lots of good reasons.

(Of course, make can theoretically be made to work for any arbitrary scenario, but in this thread we've already seen quite a few incredibly awful/ugly hacks posted to solve problems... that shouldn't have been problems in the first place.)


Other replies have mentioned C++. Personally, I think 5 seconds is a lot! I can live with rebuilds taking that long, but if I can make them take 1 second or half a second instead, I’ll be much happier.



