C is so much worse than that. Many people declare symbols using macros for various reasons, so you end up with things like DEFINE_FUNCTION(foo) {. To get a complete list of symbols you need to preprocess the file, which requires knowing what the compiler flags are. Nobody really knows what their compiler flags are, because they are hidden behind multiple levels of indirection and a variety of build systems.
> C is so much worse than that. Many people declare symbols using macros for various reasons, so you end up with things like DEFINE_FUNCTION(foo) {.
That’s not really C; that’s a C-based DSL. The same problem exists with Lisp, except even worse, since its macro system is much more powerful and hence encourages DSL creation much more than C does. But in fact it can happen with any language - even if a language lacks a built-in preprocessor or macro facility, you can always build a custom one, or use a general-purpose macro processor such as M4.
If you are creating a DSL, you need to create custom tooling to go along with it - in the ideal scenario, your tools are so customisable that supporting a DSL is more about configuration than coding something from scratch.
If your Lisp macro starts with a symbol whose name begins with def, and the next symbol is a name, then good old Exuberant Ctags will index it, and you get jump to definition.
Not so with DEFINE_FUNCTION(foo) {, I think.
$ cat > foo.lisp
(define-musical-scale g)
$ ctags foo.lisp
$ grep scale tags
g foo.lisp /^(define-musical-scale g)$/;" f
Exuberant Ctags is not even a tool from the Lisp culture. I suspect it is mostly shunned by Lisp programmers. Except maybe for the Emacs one, which is different. (Same ctags command name, completely different software and tag file format.)
Other languages have preprocessors or macro facilities too.
C's is very weak. Languages with more powerful preprocessors/macros than C's include many Lisp dialects, Rust, and PL/I. If you think everyone using a weak preprocessor is bad, wait until you see what people will do when you give them a powerful one.
Micro Focus COBOL has an API for writing custom COBOL preprocessors in COBOL (the Integrated Preprocessor Interface). (Or some other language, if you insist.) I bet there are some bizarre abominations hidden in the bowels of various enterprises based on that ("our business doesn't just run on COBOL, it runs on our own custom dialect of COBOL!")
c's macro system is weak on purpose, based on, i suspect, bad experiences with m6 and m4. i think they thought it was easier to debug things like ratfor, tmg, lex, and (much later) protoc, which generate code in a more imperative paradigm for which their existing debugging approaches worked
i can't say i think they were wholly wrong; paging through compiler error messages is not my favorite part of c++ templates. but i have a certain amount of affection for what used to be called gasp, the gas macro system, which i've programmed for example to compute jump offsets for compiling a custom bytecode. and i think m4 is really a pathological case; most hairy macro systems aren't even 10% as bad as m4, due to a combination of several tempting but wrong design decisions. lots of trauma resulted
so when they got a do-over they eliminated the preprocessor entirely in golang, and compensated with reflection, which makes debugging easier rather than harder
probably old hat to you, but i just learned last month how to use x-macros in the c preprocessor to automatically generate serialization and deserialization code for record types (speaking of cobol): http://canonical.org/~kragen/sw/dev3/binmsg_cpp.c (aha, i see you're linking to a page that documents it)
C's is weak yet not weak – you can do various advanced things (like conditional expansion or iteration), but only via esoteric voodoo with extreme compile-time cost. Other preprocessors let you do the same things with builtins that are fast and easy to grok.
Poor C preprocessor performance has a negative real-world impact, for example recently with the Linux kernel – https://lwn.net/Articles/983965/ – a more powerful preprocessor would let people do the things they are already doing much more cheaply.
I've always suspected the powerful macro facilities in Lisp are why it's never been very common - the ability to write proper macros means all the very smart programmers create code that has to be read like a maths paper. It's too bespoke to the problem domain, and too tempting to make it short rather than understandable.
I like Rust (though I have not yet programmed in it), but I think if people get too into macro-generated code, there is a risk to its uptake.
It's hard for smart programmers to really believe this, but the old "if you write your code as cleverly as possible, you will not be able to debug it" is a useful warning.
Yes, the usefulness of macros always has to be balanced against their cost. I know of only one codebase that does this particular thing, though: Emacs, which uses it to define Lisp functions that are implemented in C.
It's a common pattern for just about any binding of a C implementation to a higher-level language. Python has a similar pattern, and I once had to re-invent it from scratch (not knowing any of this) for a game engine.