This is becoming such a tiresome opinion. How are concepts fixing a problem created by previous features of the language? What about ranges? Auto? Move semantics? Coroutines? Constexpr? Consteval? It is time for this narrative to stop.
Move semantics is only needed because C++ introduced implicit copies (copy constructors), and of course they fucked it up by making them non-destructive, so they aren't even 'zero cost'.
Constexpr and consteval are hacks that 1) should have just been the default, and 2) shouldn't even be on the function definition; it should instead have been a keyword at the usage site (just use const):
int f() { ... } // any old regular function
const int x = f(); // this always gets evaluated at compile time (or, if it can't be, it fails to compile)
int y = f(); // this is evaluated at runtime
That would be the sane way to do compile time functions.
I agree that I would have preferred destructive moves, but move semantics makes C++ a much richer and better language. I kinda think pre-move semantics, C++ didn't quite make "sense" as a systems programming language. Move semantics really tied the room together.
> const int x = f(); // this always gets evaluated at compile time (or, if it can't be, it fails to compile)
That's very silly. You're saying this should fail to compile?
void foo(int x) {
const int y = bar(x);
}
There's no way the compiler can run that, because it doesn't know what x is (indeed, it would have a different value every time you run the function with a new argument). So your proposal would ditch const completely except in the constexpr case; everything runtime would have to be mutable.
So you respond "well, I didn't mean THAT kind of const, you should have a different word for compile-time constants and run-time non-mutability!" Congratulations, you just invented constexpr.
There are many bad things about C++, but constexpr ain't one of them.
>There's no way the compiler can run that, because it doesn't know what x is (indeed, it would have a different value every time you run the function with a new argument). So your proposal would ditch const completely except in the constexpr case, everything runtime would have to be mutable.
Yeah, I see no problem with that.
Using 'const' in non-constant-expression contexts has always seemed like a waste of time to me; I've never found it useful.
But I guess a lot of people really like typing const and "preventing themselves from accidentally mutating a variable" (when has that ever happened?), so as a compromise I guess you can have a new keyword to force constant expressions:
constexpr auto x = foo(); // always eval at compile time
const auto x = foo(); // old-timey const, probably runtime but maybe constant-folded.
but it's not really a big deal what the keyword is; the main point was that "give me a constant value" should be at the usage site, not at the function definition.
> the main point was that "give me a constant value" should be at the usage site, not at the function definition.
The issue is, not everything can be done at compile time, and so “I can use this at compile time” becomes part of the signature because you want to ensure that it will continue to be able to be used that way. Without it, changes in your function could easily break your callers.
Exactly right. There's a huge benefit to encoding the ability for compile-time evaluation in the signature of the function itself. Much better than doing it "ad hoc", like template instantiation does: sometimes it works, sometimes it doesn't. constexpr functions always work.
I like const because I can look at a function signature and know that nothing downstream is mutating a parameter. I can also write a function that returns a const reference to a member and know that nobody in the future will ever break my invariants.
This isn't about "oops, I didn't mean to mutate that." This is about rapidly being able to reason about the correctness of some code that is leveraging const-qualified code.
I kinda like it on occasion. It works like Python's defaultdict. Like, if you wanna count something:
for (const auto &thing : collection) {
    counts[thing]++;
}
Works nicely; you don't have to check if it's already there before ++ing it. As long as you know that's what operator[] does, it comes in handy more often than I would've expected.
Yeah. It has its uses. You could accomplish the same with the rather verbose `counts.try_emplace(thing).first->second++` but nobody wants to type that (even if it is more explicit about what it's doing).
Another popular use case is something along the lines of:
That said, I don't know what behavior I'd want if maps didn't automatically insert an element when the key was absent. UB (as with vector)? Throw an exception? Report the incident to Stroustrup? All these options feel bad in different ways.
Maybe it's not a concern in C-family languages, but Rust's culture of defaulting to let and only using mut when it's specifically required does feel very pleasant and ergonomic when I'm in that headspace.
Eh, not really accurate, because C's const means immutable, not actually constant. So I get introducing constexpr to actually mean constant. But yeah, constexpr x = f() should probably have worked as you described.
const is different in C++ from const in C. const variables in C++ are proper compile-time constants. In C they are not (the nearest equivalents are #define and enum values).
So in C++ "const x = EXPR" would make sense to request compile-time evaluation, but in C it wouldn't.
Ouch, but thanks. I learned something today - something I'd long forgotten. I like your example; it shows the point well. (Though there are circumstances where a compiler can unroll such a loop and infer a compile-time constant, that wouldn't qualify as a constant expression at the language level.)
It's been so long since I used C++ for serious work that, back then, we weren't even on C++11, so neither auto nor range-for was available. It would be uncommon to see "const type = " with a non-reference type and a non-constant initialiser.
Even with your example, some styles avoid "const auto item", using either "auto item" or "const auto& item" instead, because the "const" matters when taking a reference, not so much with a copy.
But I appreciate your point applies to const variables with non-constant initialisers in general, in the language.
There was once a big deal in the literature about const in C++ being the "better" alternative to how #define is commonly used in C for constant values, and it seemed applicable to the thread as a key distinction between C and C++, one which the parent commenter seemed to have conflated by mistake.
But I'd forgotten about const (non-reference) variables accepting non-constant initialisers, and as I hadn't used C++ seriously in a while, and the language is always changing, I checked in with a couple of C++ tutorials before writing. Unfortunately those tutorials were misleading or too simple, as both said nothing about "const type x = " (non-reference/pointer) being used in any way other than for defining compile-time constants.
It's a bit embarrassing, as I read other parts of the C++ standard quite often despite not using it much these days (I'm into compiler guts, atomics, memory models, code analysis, portability issues, etc.). Yet I had forgotten this part of the language.
So, thanks for sending me down a learning & reminder rabbit-hole and correcting my error :-)
I thought the whole point of ranges was to solve problems created by iterators, move semantics to take care of scenarios where NRVO doesn't apply, and constexpr and auto because we were hacking around their absence with macros (if you can even call it that)?
To me, redoing things that are not orthogonal implies that the older version is being fixed. Being fixed implies that it was incorrect. And to clarify: sure, auto types and constexpr are entirely new things we didn't have (auto changed meaning, but yeah), but we were trying to get something like them using macros.
> To me, redoing things that are not orthogonal implies that the older version is being fixed
The older version is being improved, especially for ergonomics. Regarding your examples: ranges do not obsolete iterators, they are just a convenient way to pass around iterator pairs, and actual ranges are better implemented in terms of iterators when they are not just a composition of other ranges. Similarly, move semantics has little to do with NRVO (and in fact using std::move is often suboptimal, as it inhibits NRVO).
Again, I have no idea how constexpr and auto have anything to do with macros.
auto is fixing the problem of long-ass type names for intermediaries thanks to templates and iterators.
Move is fixing the problem of unnecessary mass-construction when you pass around containers.
std::ranges was introduced because dear fucking god the syntax for iterating over a partial container. (And the endless off-by-one errors)
concepts, among other things, fix (sorta) the utter shit show that templates brought to error messages, as well as debacles like SFINAE and std::enable_if.
You're right. They're not fixing problems created by previous features. They're all fixing problems created or made massively worse by templates.
AFAIK, 250W is the net optical power of the light arriving at the wafer after it has reflected off many mirrors, with a very inefficient process to generate that light from the tin plasma on top of that.
Did you verify their claims or are you just calling BS and that's it? The new functions are in fact much faster than their C equivalent (and yes, I did verify that).
Your original claim "I've not checked but this guy, and by extension the C++ standards committee who worked on this new API, are probably full of shit" was pretty extraordinary.
Look at the compiler-generated instructions yourself if you don't believe the source that I linked; in the cases I've seen, all the extra new stuff just adds another layer on top of the existing functions, and if the former are faster, the latter must necessarily be too.
The standards committee's purpose is to justify their own existence by coming up with new stuff all the time. Of course they're going to try to spin it as better in some way.
It compiles from source, can be better inlined, and benefits from dead-code elimination when you don't use an unusual radix.
It also doesn't do locale-based things.
I've gone through it too. Twice. The 2nd time I just told them that their process sucks and they said basically "thank you for applying please don't hesitate to try again!"
I didn't even proceed when I saw what they wanted. I thought about how many people are likely to be applying to Canonical, and decided not at all worth the time.
I pretty much refuse to work at any company that has anything resembling an IQ test.
I once found myself unexpectedly being asked to take a Wonderlic test. I failed the math portion on purpose; then, when I went into the actual interview, I said roughly "haha, that math section was so hard".
My degree is in CS & Math, I just didn't want to burn the recruiter relationship.
I mean, they probably filter for the kind of people who can tolerate and go through this stuff, because their positions also require brainless man-machines.
Rough transcription: for its reproduction, the wasp injects a cockroach with venom to paralyze it, then stings it in exactly the part of the brain responsible for its flight reflex to make it docile. After nibbling a bit on the cockroach's antennae and enjoying the hemolymph seeping out, the wasp takes the cockroach by its antennae and, even though it is much bigger than the wasp, leads it to its nest like a dog on a leash. In the nest, the wasp lays an egg on the cockroach, which soon hatches into a larva that proceeds to eat the cockroach alive while it is still dazed... And this is just the tip of the iceberg.
There might be a few, but the complexity of the task reduces the number of options (the most popular choice is probably FreeType+HarfBuzz). But even then there's the problem that FreeType's text rendering might look slightly different than the underlying system's text renderer (so your application stands out like a sore thumb), and text rendering also affects UI layout (just take right-to-left languages for instance) - so it's often not an isolated drop-in solution.
Ideally the underlying system would offer a low-level text rendering API that's independent of the system's UI framework and can be combined with your own rendering efficiently (for instance, on Windows there's DirectWrite; I haven't used it, so I don't know how good it actually is). The problem with this approach is that not all platforms provide such a modular text renderer (for instance web browsers), and you need different code paths on different platforms (maybe even still have to integrate FreeType because one of the target platforms doesn't expose a text rendering API).
HarfBuzz only provides 'text shaping' (very important for Unicode support, especially for languages like Arabic); the actual text rendering needs to be done elsewhere.
Their rendering engines use the OS-native one. They have abstractions over that; for example, they'll use DirectWrite on Windows. Implementing a partial GUI toolkit is just another item on the list of extremely hard problems in browser engine development.