
The thing that has really kept me from getting behind updates to the C++ universe is the lack of progress on improving the state of build tooling. It is miserably underengineered for modern, dependency-heavy environments. C++20 does introduce modules, which is a good push towards correcting the problem, but I'm still going to be "wait and see" on whether the actual implementation pans out.



Well, there's Conan, which helps a bit, but these days what I simply do is use CMake to download a package from GitHub or wherever and build it.

Sadly, C++ ABIs aren't standardized the way C ABIs are (I'm OK with why, but it's unfortunate in that it creates a barrier), so you have to have separate libraries compiled with g++ and clang++ if you use both on your platform (we use both because they catch different bugs, and for that matter exhibit different bugs). It means you can't simply install, say, fmt in a system-wide directory like /usr/lib or /usr/local/lib.

Just as an amusing side note: Common Lisp used to be criticized for the massive size of its libraries, and later C++ was likewise. It was true they were quite large. Now both are criticized for their tiny libraries, which by today's standards they are.


You can definitely use C++ libraries compiled with clang++ when your code is compiled with g++, and vice versa. It only gets funky when one uses libc++ and the other libstdc++.


...unless those libraries use standard library datastructures, as you point out.


No, GCC and clang are fully ABI compatible (modulo bugs, of course). Libstdc++ and libc++ are not, so use whatever the standard library of your platform is (i.e. the default for clang and GCC) and things work fine.


Oh I see; we were speaking past each other.

Yes, gcc and clang use the same platform ABI, so if they use the same standard library headers (libstdc++ or libc++) then they will indeed use identical structure layouts, etc.

I meant a "gcc-native" toolchain (gcc + libstdc++) vs. an "llvm-native" one (clang++ + libc++) having different layouts (and there is even some interoperability between them, thanks to work by the LLVM team). I realize my need to do this (to try to catch more bugs) is a special case, and probably unusual.


There is not really anything more “native” about using libc++ with clang as opposed to libstdc++ other than the fact that they happened to be developed by the same project. Using clang with libstdc++ is extremely mainstream and normal.

Actually I would bet that even among clang users, libstdc++ is used more commonly on GNU/Linux (IDK for sure, but it’s my hunch).


The parent says they want to use libc++ to catch more bugs, which is a reasonable use case.


> Just as an amusing side note: Common Lisp used to be criticized for the massive size of its libraries and later likewise C++.

Part of "size of libraries" is "mental size of libraries".

And C++ and Lisp have very large mental spaces for their core libraries. A "String", for example, carries a huge amount of mental baggage in those languages. In most other languages, a string is extremely straightforward because it was designed into the language from the start.


It's possible I'm just used to it, but I've never found std::string more complicated than, say, python (what's up with unicode in 2 vs 3?) or JavaScript (UTF-16 = surrogate pair pain).

It's essentially a std::vector<char> with a few random convenience features bolted on.

I guess some of the confusing points are: it's not Unicode-aware, string literals aren't std::strings by default, c_str returns a pointer to a buffer one byte longer than the string length, and the usual C++ quirks like why is there both data and c_str?
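
For illustration, here's a rough sketch of those quirks (assuming a UTF-8 source and execution encoding; just illustrative, not a definitive reference):

    #include <iostream>
    #include <string>

    int main() {
        auto lit = "héllo";      // type is const char*, not std::string
        std::string s = lit;     // becomes a std::string only via conversion

        // Not Unicode-aware: size() counts bytes (6 in UTF-8),
        // not the 5 visible characters.
        std::cout << s.size() << '\n';

        // c_str() points to a buffer of size() + 1 bytes; the extra
        // byte is the trailing '\0'.
        std::cout << (s.c_str()[s.size()] == '\0') << '\n';   // prints 1
    }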


> the usual C++ quirks like why is there both data and c_str

The usual C++ response: for backwards compatibility, because data was not required to null-terminate prior to C++11.
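
A tiny sketch of what that means after C++11 (both accessors now point at the same null-terminated internal buffer; the pre-C++11 looseness is exactly what the backwards-compatibility note refers to):

    #include <cassert>
    #include <string>

    int main() {
        std::string s = "hello";

        // Since C++11, data() and c_str() return the same pointer,
        // and the buffer is guaranteed to be null-terminated.
        assert(s.data() == s.c_str());
        assert(s.data()[s.size()] == '\0');
    }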


>In most other languages, a string is extremely straightforward because it was designed into the language from the start.

I think one of the classic advantages of C++ over C is that you have the option of std::string instead of char arrays.

I don't program in C++ so I don't really know, but I do a lot of pure C, and the strings there truly are a mess (despite being VERY straightforward).
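
A quick side-by-side sketch of the difference (just illustrative; the C half shows the manual bookkeeping that std::string does for you):

    #include <cstdio>
    #include <cstring>
    #include <string>

    int main() {
        // C style: the caller owns the buffer size, the null terminator,
        // and the overflow checks.
        char c_greeting[32];
        std::strcpy(c_greeting, "Hello, ");
        std::strncat(c_greeting, "world",
                     sizeof(c_greeting) - std::strlen(c_greeting) - 1);

        // C++ style: std::string tracks storage and length itself.
        std::string cpp_greeting = std::string("Hello, ") + "world";

        std::printf("%s / %s\n", c_greeting, cpp_greeting.c_str());
    }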


One of the things I like about C++ is that there is the std::string for common uses, but then you can design your own string classes with defined conversions to std::string. Qt adds QString with lots of nice utility methods, UnrealEngine adds optimized string types, etc. So you can have custom tailored classes for the task at hand, but easy to convert around to the different types with std::string conversions defined.
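
Here's a minimal sketch of the pattern (MyString is a made-up type for illustration; QString and Unreal's string types have their own, richer APIs and conversion helpers):

    #include <algorithm>
    #include <cctype>
    #include <string>

    // A made-up utility string type with a defined conversion back to
    // std::string, the "common currency" type.
    class MyString {
        std::string data_;
    public:
        MyString(std::string s) : data_(std::move(s)) {}

        MyString upper() const {
            std::string out = data_;
            std::transform(out.begin(), out.end(), out.begin(),
                           [](unsigned char c) { return std::toupper(c); });
            return MyString(out);
        }

        operator std::string() const { return data_; }   // the escape hatch
    };

    void takes_std_string(const std::string&) {}

    int main() {
        MyString s("hello");
        takes_std_string(s.upper());   // converts via operator std::string
    }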


One of the things I dislike about C++ is that any large project will have lots of code converting between a dozen custom and gratuitously different string types.


This is the #1 thing I love about C++ compared with Rust — I don’t want it to be easy to depend on thousands of things. I would rather use a small, curated, relatively standard set of libraries provided by my OS vendor or trusted third parties.

“Modern, dependency-heavy environments” are a symptom of the fact that certain ecosystems make it easy to get addicted to dependencies; I don’t think they’re a goal to strive towards.


That's throwing the baby out with the bathwater. Building a C++ project with even a few critical dependencies (e.g. graphics libraries, font libraries, a client for a database or other complex network protocol) is a massive hassle. Sometimes those are curated well, sometimes they're not--but they remain essential for many projects. By taking a hard line against the hypothetical explosion of low-quality dependencies, you compromise the ability to use even the "small, curated, relatively standard set" of dependencies that are genuinely essential.


It's not about the ease of use (you need just a couple of lines in CMake to trigger the pkg-config scripts that every lib in the distribution has). It's about the people who work hard on maintaining the distributions. That's where the quality comes from.

And not only for the libraries:

http://kmkeen.com/maintainers-matter/


I'm not entirely convinced that this is a bad thing. The dependency-heavy environments a la Node.js gave us some interesting security nightmares.


Yes, Node is a nightmare; however, you don't need to use public repositories. That much is a choice.

Similarly, Cargo in Rust is an absolute dream to work with, you just run `cargo new <project>` and then `cargo build`. That's it. If you want to add a dependency you can do so from the public crates.io repository or a sub-path or a git repository.

No language should be without this core feature, so it'd be great to see C++ get it too.


Package managers of the OS distributions just work, so many people don't see it as a "core feature" for the language.


That's not the same thing at all. Different versions, sources, etc. Should they really be the job of the OS package manager, even when statically linked?


When statically linked it's even simpler: the dependency is used only during the build.


Maybe I wasn't clear: I'm advocating that a solid build system and build-dependency manager should be part of the core.


What does this feature have to do with programming languages? Why do I need one dependency tracking tool for Rust, a separate one for Go, a separate one for JavaScript, etc?

I already have a dependency management tool for C++; it is called the Ubuntu package repositories. I don’t need another one baked into the language.


Ubuntu and other OS package managers are years out of date, and they sit at the wrong layer for serious projects: because they operate at the OS level, they aren't good for hermetic builds on build servers.

https://repl.it/site/blog/packager


> No language should be without this core feature, so it'd be great to see C++ get it too.

BTW, modules won't address this. I'm not sure you thought it would, but some other downthread comments implied that some people did.


OS distributions have package managers and package repositories that have maintainers who are mostly decoupled from the developers. So that takes care of the quality/security problems that arise in ecosystems like Node.js.

There is also C. The tooling and the "package manager for C++" would be expected to work seamlessly for C and to be accepted and used by the C community.

(personally I use CMake + OS package manager)


Although I agree with your point, CMake + vcpkg goes a long way for hobby projects, and CMake with a bit of custom scripting goes a long way for larger-scale projects.

The cmake language might not be beautiful, but it does allow for simple sub-projects/library scripts once the main cmake script is set up properly.


I think the general thinking is that in the C/C++ world, dependency management is the role of the build tool.

This is currently really easy to do cleanly through CMake by using Hunter and toolchain files (don't use submodules or ExternalProject_Add).

External tools like Conan are also unnecessary because they introduce redundancy and extra maintenance.


> build tooling...is miserably underengineered for modern, dependency-heavy environments.

My advice about build tooling is to use Buck (from Facebook) or Bazel (from Google). If you have an ex-Googler nearby, use Bazel. If you have an ex-Facebooker nearby, use Buck. Otherwise, flip a coin.


While I love these projects, neither one of them actually fixes the problem of an ecosystem of dependencies.

Buck and Bazel were created for complex internal dependencies.

You'll have just as easy/hard of a time getting Boost installed with Bazel as you would for Make.

Or publishing an accessible library that depends on Boost.


> You'll have just as easy/hard of a time getting Boost installed with Bazel as you would for Make.

> Or publishing an accessible library that depends on Boost.

This is patently false. Here is the Bazel dependency for Boost (put this in your WORKSPACE file):

    load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

    git_repository(
        name = "com_github_nelhage_rules_boost",
        commit = "6d6fd834281cb8f8e758dd9ad76df86304bf1869",
        shallow_since = "1543903644 -0800",
        remote = "https://github.com/nelhage/rules_boost",
    )

    load("@com_github_nelhage_rules_boost//:boost/boost.bzl", "boost_deps")
    boost_deps()
No other installation necessary. If someone has a C++ compiler and Bazel installed, that will allow them to build against Boost deps (e.g. "@boost//:callable_traits"). No downloading Boost by hand, no building Boost by hand, no managing its build; Bazel does all of that.

Bazel does have a dependency ecosystem; it's based on hashed versions of publicly available git repos and HTTP archives (and anything else you want, really), which means any code anywhere can be a dependency while still being reproducible. Additionally, you can provide local build instructions for the external code, so you don't even need to rely on its build. The better way is to find someone who maintains a BUILD file for the repo (or mirror it and maintain one yourself), but still.


Boost may have been a bad example, since it is a common dependency, without any sub-dependencies.

To beetwenty's point, the C++ ecosystem in general lacks the level of support for dependency ecosystems that you find in Maven, pip, gem, npm, etc.

Bazel external dependencies are in fact a real pain point. See the 3+ year old Bazel Recursive WORKSPACE proposal. [1]

I was at the first Bazel Conf in 2017, and the external dependencies meeting was quite the row.

[1] https://bazel.build/designs/2016/09/19/recursive-ws-parsing....


I feel like that's a misunderstanding of the issue. Recursive dependencies are orthogonal to the issue of having an ecosystem of dependencies. There is a pattern for recursive dependency management (did you notice the `boost_deps` recursive dependency call?) that is currently fine for production and that recursive workspaces will make better.

Also, as demonstrated, Boost is way easier to include with Bazel than with Make, which was the original issue under discussion.

Say I make a library that depends on Boost; here is how you use it:

    git_repository(
        name = "foo",
        ...
    )

    load("@foo//:foo/foo.bzl", "foo_deps")
    foo_deps()
magic


The trick is that now one depends on somebody who is not Boost for these nice mappings. I brought in such a dependency for gRPC once, and now it doesn't work with Bazel 1.0. I either have to find the author, understand all the Bazel trickery, or switch to something else, because most of these bindings depend on many Bazel features.

So bringing in multiple such third-party band-aids is currently not such a great idea. It would be a bit better if Boost itself provided it.


Wasn't this because the dependency for gRPC was moved into Bazel itself?

How is this different than most other build systems? You always have to rely on others.


What kinds of problems do you run into with Boost on Bazel? Are there no available build files you can just drop and use directly?


Ah. Yes, I wasn't the one who had to get Boost into Buck, so I don't know the pain. But I also have been told to avoid Boost for modern C++, so...


I need Boost, but mainly because I use other packages that depend on Boost.

I also use Boost when the compiler is behind (e.g. boost::variant until Apple's compiler caught up).

I can't wait to have a Boost-free tree, but I suspect it will be a while.


Why would someone avoid Boost?


I have seen a library with a copy of Boost in its include folder. Client code is forced to use this outdated version and must avoid transitive dependencies on Boost. Please don't do that.


It’s a very heavy dependency, in many ways.


Look into Bazel; so far it has been a dream.


What's the specific problem? Big companies run 100K+ .so/.o systems.



