C++17 and other future highlights of C++ (meetingcpp.com)
129 points by meetingcpp on March 10, 2016 | 47 comments


Looks like C++17, like C++14, is just going to be a spit shine over C++11.

Concepts, modules and coroutines were perhaps the three most anticipated features, and they failed to deliver on all three. No STL2 either. Woeful. The field is now wide open for the next five years for Rust to come along and eat C++'s lunch.

The whole TS/std::experimental idea is just folly. As a C++ programmer I just don't care whether a library is available via a TS or via Boost. There's literally zero advantage to a TS. None. I'd rather stick with Boost, because then at least I only have one implementation to test against, instead of three, each with its own bugs, omissions and performance characteristics.

I've even taken to using boost::container::string where I can, instead of std::string, because the libc++ and stdlibc++ implementations have different performance characteristics.
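
(To illustrate: the swap is close to mechanical. A minimal sketch, assuming Boost.Container is installed:)

  #include <boost/container/string.hpp>
  #include <iostream>

  int main() {
      // Same interface as std::string for the common operations, but a
      // single implementation regardless of which standard library you use.
      boost::container::string s = "hello";
      s += ", world";
      std::cout << s.c_str() << "\n";
  }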


> Concepts, modules and coroutines were perhaps the three most anticipated features

Were they? They're great and all but from a practical standpoint I'm most excited about networking being baked in. I do almost nothing nowadays that doesn't require some form of networking. Now I can do it without bringing in any extra dependencies beyond the STL? And it's portable? Yes please!

I'm not a huge C++ developer. Maybe if I was other things would be more important to me. But adding in networking makes me want to use the language again.


Networking is a very particular set of features that you only need if you are, well, doing something with the network. I'm not, and probably never will be, for example. But concepts and modules (and perhaps coroutines) were general features that would have potentially benefitted every single problem domain, since they are fundamental to how you use the language, not what you are trying to do with it.


Doing networking in the standard library without introducing a proper async framework is just more madness. Polishing off promises and futures, and getting coroutines nailed down, would have made introducing an intuitive network library a doddle.

Instead we'll get something carved out of ASIO, which is a complete mess full of gross unintuitive APIs and callbacks.


What's wrong with the ASIO interface? My only issue with it is that once a socket is bound to an io_service it can't be easily rebound to another one.

For many applications where C++ is used, promises and futures introduce unnecessary overhead. On the other hand an ASIO-like interface works just fine.

Additionally, ASIO, and by extension the network proposal, have transparent opt-in support for futures via a special callback type.
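
(For illustration, a minimal sketch of that opt-in using Boost.Asio's use_future completion token, with name resolution as the example operation:)

  #include <boost/asio.hpp>
  #include <boost/asio/use_future.hpp>
  #include <future>
  #include <iostream>
  #include <thread>

  using boost::asio::ip::tcp;

  int main() {
      boost::asio::io_service io;
      tcp::resolver resolver(io);

      // Passing use_future instead of a callback makes the async
      // operation return a std::future.
      std::future<tcp::resolver::iterator> endpoints =
          resolver.async_resolve(tcp::resolver::query("example.com", "http"),
                                 boost::asio::use_future);

      std::thread runner([&] { io.run(); });
      tcp::resolver::iterator it = endpoints.get();  // blocks until done
      std::cout << it->endpoint() << "\n";
      runner.join();
  }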


Example terrible API: You have to create a socket for a new connection before calling async_accept. Try using this with coroutines.
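
(For context, the pattern being criticized, sketched with the Boost.Asio names of the era:)

  #include <boost/asio.hpp>
  #include <iostream>

  using boost::asio::ip::tcp;

  int main() {
      boost::asio::io_service io;
      tcp::acceptor acceptor(io, tcp::endpoint(tcp::v4(), 8080));

      // The socket for the not-yet-existing connection must be
      // constructed before the accept is even initiated.
      tcp::socket peer(io);
      acceptor.async_accept(peer, [&](const boost::system::error_code& ec) {
          if (!ec)
              std::cout << "accepted " << peer.remote_endpoint() << "\n";
      });

      io.run();  // blocks until the handler has run
  }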

> opt-in support for futures via a special callback type.

Which is ugly. The coroutine support is opt-in as well. Coroutines actually have the potential to be faster than callbacks.


Fair enough. But honestly the lack of modules never kept me from doing C++. It was more the lack of a modern standard library. Pretty much every modern language has a way of doing networking with its standard library.

Though you have a great point. I really miss using C++, and while it still has tons of awesome ideas and stuff, parts of it are just really far behind many other languages. I'd love to see modules come in and someone create a good package manager that uses them.


If you are talking about the networking TS based on boost::asio, it is much more than just networking. It is a full-fledged multithreading library.


At the places that I have worked, compile and link time (even when using something like IncrediBuild) is a huge factor. Being able to have modules would have improved this significantly.

Coroutines would also have been useful in many circumstances.


Well, there is progress on all features, yet nothing is ready to go into C++17. So either we get no new standard at all, or we get a Standard without the large features.

Also, C++14 was a step forward; generic lambdas are a step forward (boost::hana is based on them, and even brings compile times down). I expect similar language features in C++17 that ease programming and will bring new innovations to C++.
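
(A minimal sketch of a C++14 generic lambda, the feature credited here:)

  #include <iostream>
  #include <string>

  int main() {
      auto twice = [](auto x) { return x + x; };      // 'auto' parameter:
      std::cout << twice(21) << "\n";                 // one lambda, many
      std::cout << twice(std::string("ab")) << "\n";  // types: 42, abab
  }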

Your boost argument is only valid for library features. Modules or concepts can never be a part of boost in the way they are in the TS.


> So either we get no new standard at all, or we get a Standard without the large features.

No new standard then. The C++ compilers, standard libraries and the ecosystem have barely caught up with C++11, and C++17 is already on the horizon. If it doesn't bring much benefit, then it really cannot justify the compatibility nightmares. I for one will skip it.

> Your boost argument is only valid for library features.

That is exactly the point. Library features are not exciting, as they can be implemented just fine (and better) by third parties. Therefore a C++ standard with only library features is hardly useful at all.


> The C++ compilers, standard libraries and the ecosystem have barely caught up with C++11

That's not really true. Both GCC and Clang have reached C++11 and C++14 feature-complete status very quickly.


The latest versions, yes. But how many production systems use the latest compilers?


For me concepts are a joke. They have been in development, like, forever (heck, even the Perl 6 project got started and was released in the meantime!), they are hugely polarizing for the committee, and the supposed benefits are, well, lofty. In the meantime everybody just got better at spotting common patterns in template error messages. Is there a point beyond which everybody will essentially give up, or is it a point of honor to bring them into the standard?
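
(For context, roughly what was on the table, in Concepts TS spelling as implemented behind GCC 6's -fconcepts flag; a sketch, not standard C++:)

  template <typename T>
  concept bool Addable = requires(T a, T b) { a + b; };  // named requirement

  template <Addable T>
  T sum(T a, T b) { return a + b; }

  int main() { return sum(2, 3) - 5; }
  // sum(nullptr, nullptr) would fail with a one-line
  // "constraint 'Addable' not satisfied" instead of a template backtrace.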


I agree. In particular, I'm extremely disappointed about their failure to deliver concepts.


Keep in mind that the C++ committee is made up of a group of volunteers and is an entirely non-profit organization. Participation actually comes at great personal cost to the individuals. When I see language like "failure to deliver" it makes me wonder if the committee process is being fairly perceived by the larger overall user-base.

As an aside, anyone can join the committee if they really want to see a particular language feature see the light of day in a certain way. It's not as hard as you might think. All it takes is a bit of hard work on any sufficiently motivated individual's part.

There's a great discussion of the process here: https://www.youtube.com/watch?v=PqU_ot4BlNQ


The language is just too complicated at this point. Any time you want to add something you need to ensure it plays well with everything else. It is a huge amount of friction.


As is Bjarne.


I'd say I could do without concepts, modules and coroutines, but ranges! Ranges were so nice and would finally allow easier stream handling.


What's holding you back then?

Eric Niebler's Range-v3 library is available:

https://github.com/ericniebler/range-v3/

It's not going to be (much) better in the standard...
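
(Usage looks roughly like this; a sketch against the range-v3 API of the time, which has since changed:)

  #include <range/v3/all.hpp>
  #include <iostream>
  #include <vector>

  int main() {
      std::vector<int> v = {1, 2, 3, 4, 5, 6};
      // Lazily composed views: nothing is computed until iteration.
      auto evens_squared = v
          | ranges::view::filter([](int i) { return i % 2 == 0; })
          | ranges::view::transform([](int i) { return i * i; });
      for (int i : evens_squared)
          std::cout << i << " ";  // 4 16 36
  }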

Except that it would use real concepts then...


Sure, if you ignore statements (from the author!) like:

> Check out the (woefully incomplete) documentation here.

and

> No promise is made about support or long-term stability. This code will evolve without regard to backwards compatibility.

That's not to say I blame him, but you really shouldn't be recommending range-v3 for general use.


Good realization. Now take it one step forward, and realize that C++11 was just a "spit shine" over C++98 (although spit isn't the bodily fluid I'd use to describe it). It's all part of the marketing machine to sell an ancient programming language riddled with security holes and WTFs. And it's working pretty well, unfortunately... the Donald Trump of programming languages.


Special functions?[1] Why do they keep wanting to add that to the standard? They've been trying to get special functions into the standard since C++11 (née C++0x). That seems like such a very niche use, better left for specialised libraries where numerical researchers can work out the very best algorithms rather than for a standard system library. I mean, compared to accessing a filesystem or parallel processing, how many C++ users regularly need the Riemann zeta function or a modified Bessel function of the second kind?

---

[1] https://en.wikipedia.org/wiki/Special_functions


> ...how many C++ users regularly need the Riemann zeta function or a modified Bessel function of the second kind?

I've wondered the same thing myself. They're probably easier to standardize since the specification for that behavior is already done by mathematicians. And the implementation is generally re-entrant so the possibility of show-stopper design bugs is minimal.

Getting concepts wrong, in contrast, has a much larger downside.

Not saying special maths should be standardized. Just attempting to describe the phenomenon.


Maybe because the need for them has popped up enough times and they thought it would be a good idea to add them to the standard? Also, that addition wouldn't hurt anyone.


> Also, that addition wouldn't hurt anyone.

Library implementors would have to implement it. Most of them are probably unqualified to do it, so they'll probably just grab Boost's implementation. And Boost's implementation would be unreadable to most implementors. Thus everyone would get a slightly subpar implementation, because most numerical researchers are not keeping an eye on Boost and making sure it has the best methods available. The end result is that those in the know will avoid the C++ stdlib, just as Qt did when it reimplemented strings and maps.

We've seen this problem before with valarray and with export. There is a problem with standardising something that nobody wants or nobody can implement correctly.


The flip side is that I would rather use a “slightly subpar” implementation than have to roll my own from a Google search and some smart guessing.

For example, if you want to implement a Student's t-test, you will soon run into the need for a Beta function or the gamma function.

One could probably copy-paste a somewhat working version together from search engines and add asserts to prevent ever calling it with arguments outside the range where it seems to work, but for most of us, the result will likely be slower and less precise than a version that gcc, clang, or commercial compilers would provide.

And yes, I could probably buy an implementation somewhere, but in the real world, that often isn’t a real option, and even then, I wouldn’t know how well it worked, either.
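
(Some of this is already within reach; a sketch using std::tgamma and std::lgamma, which <cmath> has shipped since C++11. std::beta is among the proposed special-math additions:)

  #include <cmath>
  #include <iostream>

  int main() {
      std::cout << std::tgamma(5.0) << "\n";    // gamma(5) = 4! = 24
      std::cout << std::lgamma(100.0) << "\n";  // log-gamma avoids overflow
      // With the special-math additions:
      //   std::beta(a, b) == tgamma(a) * tgamma(b) / tgamma(a + b)
  }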


Wouldn't it be pretty easy to compare a re-implementation of a special function against a reference Boost implementation? For example, there are only 2^32 floats, and for a given function you can check them all very quickly. It's a little trickier for doubles, but one can imagine doing, say, 2^40 tests and getting a reasonable level of confidence.
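
(A sketch of that exhaustive sweep, with std::exp standing in for the function under test and its double-precision evaluation used as the reference:)

  #include <cmath>
  #include <cstdint>
  #include <cstring>
  #include <iostream>

  int main() {
      std::uint64_t mismatches = 0;
      // Visit every one of the 2^32 float bit patterns.
      for (std::uint64_t bits = 0; bits <= 0xFFFFFFFFull; ++bits) {
          std::uint32_t b = static_cast<std::uint32_t>(bits);
          float x;
          std::memcpy(&x, &b, sizeof x);  // reinterpret bits as a float
          if (std::isnan(x)) continue;    // NaN never compares equal
          float got  = std::exp(x);       // "candidate" implementation
          float want = static_cast<float>(std::exp(double(x)));  // reference
          if (got != want) ++mismatches;  // or compare in ULPs instead
      }
      std::cout << mismatches << " of 2^32 inputs differ\n";
  }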


Well, they were accepted into the standard last week ;)
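
(For reference, here is what the accepted functions look like in C++17's <cmath>; a sketch, since no compiler shipped them at the time of this thread:)

  #include <cmath>
  #include <iostream>

  int main() {
      // From the special-math IS (29124), merged into C++17:
      std::cout << std::riemann_zeta(2.0) << "\n";  // pi^2/6 ~ 1.64493
      // Modified Bessel function of the second kind, K_nu(x):
      std::cout << std::cyl_bessel_k(0.5, 1.0) << "\n";
  }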


Before they heap yet more into the language, I think they need to go back to previously-standardized classes and seriously critique them, ideally replacing some of them with entirely new designs.

For instance, it is crazy that something as critical and commonplace as "std::iostream" is riddled with error-prone design decisions. The "<<" operator choice is wrong, and it fails basic localization requirements before you even get into its other problems. To this day, I see people who don't handle stream errors correctly in every case (and who can blame them, when you have stuff like "bad() is actually not the exact opposite of good()"?). Don't even get me started on the fact that most flags have two different pairs of constants with similar names, used in different contexts, with different values, that will even compile after the wrong one has been chosen (producing who-knows-what behavior). Or the fact that global stream state can be screwed with from anywhere, meaning that when a library call returns you can't be sure that your own "std::cerr" state is "undamaged"? And this is just from streams! The list goes on.
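
(The global-state complaint in particular is easy to demonstrate; a minimal sketch:)

  #include <iostream>

  // Any function, e.g. deep inside a library, can leave formatting
  // flags behind on a global stream.
  void log_id(unsigned id) {
      std::cerr << std::hex << id << "\n";  // sets hex mode, never resets it
  }

  int main() {
      log_id(255);                 // prints "ff"
      std::cerr << 255 << "\n";    // surprise: still hex, prints "ff" again
      std::cerr.setf(std::ios::dec, std::ios::basefield);  // manual repair
      std::cerr << 255 << "\n";    // now "255"
  }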


Does it feel to anyone else like they are procrastinating on modules? C++ has been thinking about it for a very long time. Yet the actual result is a generic "soon."


I've been following modules pretty closely and attended the meeting in Jacksonville. I think the story here is actually pretty positive. I understand people are upset it's not making C++17 but those expectations were always unrealistic. Based on my read of the situation, modules were never a serious candidate for C++17.

IMO, what's happening with modules is basically the best (realistic) outcome. Up until Jacksonville, we had only a proposal, with implementors either implementing their own proposal or implementing nothing while waiting for a TS. Modules were blocked on the committee. And that's a slow place to be blocked.

Now we've cleared the major hurdles for a modules v1 TS and gotten to the point where implementors are unblocked by the committee and we can start actually building and using it. If the early adopters want modules from their implementations, they'll actually be able to get it, now, finally.

Once implementations start happening, and we gain a bunch of "deployment experience", it'll be a whole lot easier to chart a course through the remaining thorny issues with respect to macros and "middle-up" deployment to existing codebases.


Just speaking as an average C++ programmer who doesn't watch the standards process too closely, after C++14 it seemed like quite a few major features, modules included, were expected for C++17. At that point no one was saying it was unrealistic.

I was hoping for reflection, but it was clear to me a few years ago that there wasn't enough interest or agreement there. But I think a lot of people expected modules and concepts.


I've been hearing very consistently that there hasn't been much consensus on what a 'module' is or what it should do.


I'm finding it hard to keep up with all these new features!

In C++11, I think that the most overlooked feature is attributes. Finally, a way of potentially doing AOP without resorting to hacks :-)
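
(A sketch of two standard attributes; compilers ignore attributes they don't know, which is what makes tool-driven, AOP-style annotations conceivable:)

  #include <cstdio>
  #include <cstdlib>

  // [[noreturn]] (C++11): callers may assume this never returns.
  [[noreturn]] void fail(const char* msg) {
      std::fprintf(stderr, "fatal: %s\n", msg);
      std::abort();
  }

  // [[deprecated]] (C++14): call sites get a compiler warning.
  [[deprecated("prefer parse_v2")]]
  int parse(const char* s) { return std::atoi(s); }

  int main() {
      if (parse("42") != 42)  // warns: 'parse' is deprecated
          fail("parse broken");
  }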


Always looking for yet another reason to sustain my hope of avoiding C++ forevermore, I like the argument-reversal against using other languages instead. "Most devs don't know it, they have to retrain!"

I learned C++98 pretty deeply (better than it deserved), but apart from a few things like "hey cool, you guys sort of have lambdas now, and auto, and some new pointer types I'll learn if, against my hopes, I ever have to do anything important", I haven't really kept up much. I know there's a lot that has been added, though.

My friend (who just knows a little C++98; his main languages are higher-level things for writing automation tools) was talking to me a few weeks ago about finding a bug in his boss's code and having a moment of "Nope, let's wait for him to come back tomorrow" after seeing the totally unfamiliar rvalue reference && in a constructor he had tracked down as the source of the bug.

If I were to become involved in any C++ project again, and it wasn't strictly maintenance work, I'd feel very uncomfortable doing anything without first taking a chunk of time to learn everything new. I feel like that chunk of time would be better spent on Rust. And if I were a manager looking to build something brand new that needed to be low level, I'd also be looking heavily at Rust and hiring people who know it or want to learn it, versus going with C++ and hoping that people who put C++ on their resumes actually understand the modern details on a more than academic level (which would be hard to test for in interviews), and, if they don't, shelling out the same cost to train them as it would take to train newcomers to Rust. As for what motivates me outside the office as a true C++ replacement, so that companies don't have to train me on their dime: Nim is much more fun.


> SG14 - Games & Low Latency

Can't help but think Jonathan Blow's new language designed for games will blow C++ out of the water, and will probably arrive much sooner than these long drawn-out C++ committee features.


I recently had to do some C++ again, and the number of weird pitfalls and annoying corner cases is really starting to get to me. For example, std::cout << std::hex << myNumber prints the number in hex as expected. Well, if the number has 16 bits or more. The eight-bit integer types are aliases for character types, and std::hex does the wrong thing if you feed it a char.

There probably is a workaround for this, as there is a workaround for every other "feature", but the less often I write C++, the more I realize the language could really use a streamlining. Programming languages are weird, as they seem to develop more, not fewer, rough patches and glitches as they age.
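
(A sketch of the pitfall and the usual workarounds, assuming std::int8_t aliases signed char, as it does on common implementations:)

  #include <cstdint>
  #include <iostream>

  int main() {
      std::int8_t n = 0x2A;                      // 42
      std::cout << std::hex << n << "\n";        // prints '*': the char
                                                 // overload of << wins
      std::cout << std::hex << +n << "\n";       // unary + promotes to int: 2a
      std::cout << std::hex
                << static_cast<int>(n) << "\n";  // explicit cast: 2a
  }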


This article really should include links to the previous entries in the same series. I hoped the 'Back' link would help guide me to them, but it just runs

  history.go(-1)


The very first link is a link to the previous part of the series.


Oh, so it is. I thought it was a link to the current Technical Specifications, most likely to move into C++ after C++17.


Yawn. Tell me when they standardize 2-, 3- and 4-element vector types so I don't need to use intrinsics.


I once proposed a library for 2-, 3- and 4-element vectors based on the one in Graphics Gems.[1] It was rejected because it didn't use templates.

[1] http://www.animats.com/source/graphics/algebra3.h


The nice thing about the intrinsics is that they can be passed by value or as a return type. They map nicely to modern vector hardware. Anything done with templates is going to be very generic and will not have these characteristics. But let them continue adding abstractions with funky syntax instead of optimized versions of widely used constructs. ;-)
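
(A sketch of the value-semantics point using x86 SSE intrinsics; it needs a compiler that provides <xmmintrin.h>:)

  #include <xmmintrin.h>

  // A 4-float vector wrapping one SSE register: small and trivially
  // copyable, so it travels in XMM registers rather than through memory.
  struct vec4 { __m128 v; };

  vec4 add(vec4 a, vec4 b) {
      return { _mm_add_ps(a.v, b.v) };  // a single SIMD add
  }

  int main() {
      vec4 a{ _mm_set_ps(4, 3, 2, 1) };
      vec4 b{ _mm_set_ps(8, 7, 6, 5) };
      vec4 c = add(a, b);  // by value in, by value out
      (void)c;
  }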


I don't see why template-based types wouldn't be able to be passed by value or returned.

An interesting library is Boost::SIMD (!! not part of Boost !!): https://github.com/NumScale/boost.simd


They're inlines, not intrinsics. Intrinsics would be built into the language. Not a bad idea for basic constructs such as short vectors of floating point numbers, which are very close to the hardware.

I was writing a physics engine at the time, and it was a huge hassle that this wasn't standardized in C++. I used three libraries which each had their own definition of short vectors. Way too much conversion.

Matlab has one notation for arrays and matrices. So all Matlab libraries are compatible at the array and matrix level. This is one reason that much number-crunching is done in Matlab.

Rust already has at least three libraries for short vectors.


I think they should stop C++ standard updates for a five-year period, during which we can learn from the recent changes.



