The thing that has really kept me from getting behind updates to the C++ universe is the lack of progress on improving the state of build tooling. It is miserably underengineered for modern, dependency-heavy environments. C++20 does introduce modules, which is a good push towards correcting the problem, but I'm still going to be "wait and see" on whether the actual implementation pans out.
Well, there's Conan, which helps a bit, but these days what I simply do is use CMake to download a package from GitHub or wherever and build it.
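For example, with CMake's FetchContent the download-and-build step looks roughly like this (fmt and its version tag are just placeholder choices):

```cmake
# Minimal sketch: fetch a dependency from GitHub at configure time and
# build it as part of this project. The library and tag are examples.
cmake_minimum_required(VERSION 3.14)
project(demo CXX)

include(FetchContent)
FetchContent_Declare(
  fmt
  GIT_REPOSITORY https://github.com/fmtlib/fmt.git
  GIT_TAG        10.2.1
)
FetchContent_MakeAvailable(fmt)

add_executable(demo main.cpp)
target_link_libraries(demo PRIVATE fmt::fmt)
```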
Sadly, C++ ABIs aren't standardized the way that C ABIs are (I understand why, but it's unfortunate in that it creates a barrier), so you have to keep separate libraries compiled with g++ and clang++ if you use both on your platform (we use both because they catch different bugs and, for that matter, exhibit different bugs). It also means you can't simply install, say, fmt in a system-wide directory like /usr/lib or /usr/local/lib.
Just as an amusing side note: Common Lisp used to be criticized for the massive size of its libraries, and later C++ was too. It's true they were quite large. Now both are criticized for their tiny libraries, which by today's standards they are.
You can definitely use C++ libraries compiled with clang++ when your code is compiled with g++, and vice versa. It only gets funky when one side uses libc++ and the other libstdc++.
No, GCC and Clang are fully ABI compatible (modulo bugs, of course). libstdc++ and libc++ are not, so use whatever the standard library of your platform is (i.e. the default for Clang and GCC) and things work fine.
Yes, GCC and Clang use the same platform ABI, so if they use the same standard library headers (libstdc++ or libc++) then they will indeed use identical structure layout, etc.
I meant a "GCC-native" toolchain (gcc + libstdc++) vs. an "LLVM-native" one (clang++ + libc++) having different layouts (and there is even some interoperability between them, thanks to work by the LLVM team). I realize my need to do this (to try to minimize optimizer bugs) is a special case, and probably unusual.
There is not really anything more “native” about using libc++ with clang as opposed to libstdc++ other than the fact that they happened to be developed by the same project. Using clang with libstdc++ is extremely mainstream and normal.
Actually I would bet that even among clang users, libstdc++ is used more commonly on GNU/Linux (IDK for sure, but it’s my hunch).
> Just as an amusing side note: Common Lisp used to be criticized for the massive size of its libraries and later likewise C++.
Part of "size of libraries" is "mental size of libraries".
And C++ and Lisp have very large mental spaces for their main core libraries. A "String", for example, carries a huge amount of mental baggage in those languages. In most other languages, a string is extremely straightforward because it was designed into the language from the start.
It's possible I'm just used to it, but I've never found std::string more complicated than, say, Python's (what's up with Unicode in 2 vs. 3?) or JavaScript's (UTF-16 = surrogate-pair pain).
It's essentially a std::vector<char> with a few random convenience features bolted on.
I guess some of the confusing points are: it's not Unicode-aware, string literals aren't std::strings by default, c_str returns a pointer to a buffer one byte longer than the string length (for the null terminator), and the usual C++ quirks, like why there are both data and c_str.
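Concretely, a small self-contained sketch of those quirks (the snowman byte count assumes the usual UTF-8 encoding):

```cpp
#include <cassert>
#include <cstring>
#include <string>

int main() {
    const char* lit = "hello";        // a string literal is a const char*,
                                      // not a std::string
    using namespace std::string_literals;
    std::string s = "hello"s;         // the s suffix gives a real std::string

    // c_str() points at size() + 1 bytes: the contents plus a '\0' terminator.
    assert(std::strlen(s.c_str()) == s.size());

    // Since C++11, data() is also null-terminated; having both data() and
    // c_str() is largely historical.
    assert(s.data()[s.size()] == '\0');

    // Not Unicode-aware: size() counts bytes, not characters.
    std::string snowman = "\xE2\x98\x83";   // UTF-8 for U+2603 (snowman)
    assert(snowman.size() == 3);            // 3 bytes, 1 character

    (void)lit;
}
```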
One of the things I like about C++ is that there is std::string for common uses, but you can design your own string classes with defined conversions to std::string. Qt adds QString with lots of nice utility methods, Unreal Engine adds optimized string types, etc. So you can have classes custom-tailored to the task at hand, while it stays easy to convert among the different types via the std::string conversions.
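A minimal sketch of that pattern (MyString and its methods are hypothetical; QString's real API is of course much richer):

```cpp
#include <algorithm>
#include <cctype>
#include <string>

// Hypothetical custom string type in the spirit of QString or Unreal's
// FString: extra utilities, plus conversions to and from std::string so it
// interoperates with anything that speaks std::string.
class MyString {
public:
    MyString(const std::string& s) : data_(s) {}       // from std::string
    operator std::string() const { return data_; }     // to std::string

    // A "nice utility method" of the kind Qt adds:
    MyString toUpper() const {
        std::string out = data_;
        std::transform(out.begin(), out.end(), out.begin(),
                       [](unsigned char c) { return std::toupper(c); });
        return MyString(out);
    }

private:
    std::string data_;
};

void takesStd(const std::string&) {}

int main() {
    MyString m("hello");
    takesStd(m);                    // implicit conversion to std::string
    std::string s = m.toUpper();    // utility method, then back again
    (void)s;
}
```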
One of the things I dislike about C++ is that any large project will have lots of code converting between a dozen custom and gratuitously different string types.
This is the #1 thing I love about C++ compared with Rust — I don’t want it to be easy to depend on thousands of things. I would rather use a small, curated, relatively standard set of libraries provided by my OS vendor or trusted third parties.
“Modern, dependency-heavy environments” are a symptom of the fact that certain ecosystems make it easy to get addicted to dependencies; I don’t think they’re a goal to strive towards.
That's throwing the baby out with the bathwater. Building a C++ project with even a few critical dependencies (e.g. graphics libraries, font libraries, a client for a database or other complex network protocol) is a massive hassle. Sometimes those are curated well, sometimes they're not--but they remain essential for many projects. By taking a hard line against the hypothetical explosion of low-quality dependencies, you compromise the ability to use even the "small, curated, relatively standard set" of dependencies that are genuinely essential.
It's not about the ease of use (you need just a couple of lines in CMake to trigger the pkg-config scripts that every lib in the distribution has). It's about the people who work hard on maintaining the distributions. That's where the quality comes from.
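For reference, the couple of lines look roughly like this (fontconfig is just an example of a module the distribution ships a .pc file for):

```cmake
# Ask pkg-config for a library the distribution already packages.
find_package(PkgConfig REQUIRED)
pkg_check_modules(FONTCONFIG REQUIRED IMPORTED_TARGET fontconfig)

add_executable(app main.cpp)
target_link_libraries(app PRIVATE PkgConfig::FONTCONFIG)
```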
Yes, Node is a nightmare. However, you don't need to use public repositories; that much is a choice.
Similarly, Cargo in Rust is an absolute dream to work with, you just run `cargo new <project>` and then `cargo build`. That's it. If you want to add a dependency you can do so from the public crates.io repository or a sub-path or a git repository.
No language should be without this core feature, so it'd be great to see C++ get it too.
That's not the same thing at all. Different versions, sources, etc. Should they really be the job of the OS package manager, even when statically linked?
What does this feature have to do with programming languages? Why do I need one dependency tracking tool for Rust, a separate one for Go, a separate one for JavaScript, etc?
I already have a dependency management tool for C++; it is called the Ubuntu package repositories. I don’t need another one baked into the language.
Ubuntu and other OS package managers are years out of date, and they sit at the wrong layer for serious projects: because they operate at the OS level, they aren't good for hermetic builds on build servers.
OS distributions have package managers and package repositories that have maintainers who are mostly decoupled from the developers. So that takes care of the quality/security problems that arise in ecosystems like Node.js.
There is also C. The tooling and the "package manager for C++" would be expected to work seamlessly for C, and to be accepted and used by the C community.
Although I agree with your point, CMake + vcpkg goes a long way for hobby projects, and CMake with a bit of custom scripting goes a long way for larger-scale projects.
The CMake language might not be beautiful, but it does allow for simple sub-project/library scripts once the main CMake script is set up properly.
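For the hobby-project case, the vcpkg flow is roughly this (fmt is again just an example dependency):

```cmake
# After a one-time `vcpkg install fmt`, configure with vcpkg's toolchain:
#   cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
# and the dependency resolves like any installed package:
find_package(fmt CONFIG REQUIRED)

add_executable(app main.cpp)
target_link_libraries(app PRIVATE fmt::fmt)
```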
> build tooling...is miserably underengineered for modern, dependency-heavy environments.
My advice about build tooling is to use Buck (from Facebook) or Bazel (from Google). If you have an ex-Googler nearby, use Bazel. If you have an ex-Facebooker nearby, use Buck. Otherwise flip a coin.
No other installation necessary. If someone has a C++ compiler and Bazel installed, they can build against Boost deps (e.g. "@boost//:callable_traits"). No downloading Boost by hand, no building Boost by hand, no managing its build; Bazel does all of that.
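The consuming side is just a normal BUILD file (this assumes a WORKSPACE that maps Boost to the @boost repository, e.g. the community rules_boost setup sketched further down the thread):

```python
# BUILD: depend on a Boost sub-library like any other Bazel target.
cc_binary(
    name = "demo",
    srcs = ["demo.cpp"],
    deps = ["@boost//:callable_traits"],
)
```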
Bazel does have a dependency ecosystem; it's based on hashed versions of publicly available git repos and HTTP archives (and anything else you want, really). This means any code anywhere can be a dependency while still being reproducible. Additionally, you can provide local build instructions for the external code, so you don't even need to rely on their build. The better way is to find someone who maintains a BUILD file for a repo (or mirror and maintain it yourself), but still.
Boost may have been a bad example, since it is a common dependency, without any sub-dependencies.
To beetwenty's point, the C++ ecosystem in general lacks the level of support for dependency ecosystems that you find in Maven, pip, gem, npm, etc.
Bazel external dependencies are in fact a real pain point. See the 3+ year old Bazel Recursive WORKSPACE proposal. [1]
I was at the first Bazel Conf in 2017, and the external dependencies meeting was quite the row.
I feel like that's a misunderstanding of the issue. Recursive dependencies are orthogonal to the issue of having an ecosystem of dependencies. There is a pattern for recursive dependency management (did you notice the `boost_deps` recursive dependency call?) that is currently fine for production, and that recursive workspaces will make better.
Also, as demonstrated, Boost is way easier to include with Bazel than with Make, which was the original issue under discussion.
I make a library that depends on Boost; here is how you use it:
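A sketch of the usual setup in WORKSPACE, assuming the community rules_boost mapping (the commit and checksum are placeholders you would pin yourself):

```python
# WORKSPACE: fetch the Boost <-> Bazel mapping, then call boost_deps() to
# register Boost and its sub-libraries. The placeholders must be pinned
# to a real commit and sha256.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "com_github_nelhage_rules_boost",
    url = "https://github.com/nelhage/rules_boost/archive/<commit>.tar.gz",
    strip_prefix = "rules_boost-<commit>",
    sha256 = "<sha256>",
)

load("@com_github_nelhage_rules_boost//:boost/boost.bzl", "boost_deps")
boost_deps()
```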
The trick is that now one depends on somebody who is not Boost for these nice mappings. I brought in such a dependency for gRPC once, and now it doesn't work with Bazel 1.0. I either have to find the author, understand all the Bazel trickery, or switch to something else, because most of these bindings depend on many Bazel features.
So, bringing in multiple such third-party band-aids is currently not a great idea. It would be a bit better if Boost itself provided them.
I have seen a library with a copy of Boost in its include folder. Client code is forced to use this outdated version and must avoid transitive dependencies on Boost.
Please don't do that.