To be a bit pedantic, the people using the C++ standard are mostly compiler engineers.


Everyone writing C++ is following the standard. The people that often reference it directly are the compiler people. But if we're going to be deliberately obtuse about it, we can ignore the compiler engineers' opinions too - they're just implementing an arbitrary specification that won't impact anyone but them anyway, right?


Well no, users of C++ compilers are using an abstract, encapsulated, and portable interface to the language, which makes their code follow the standard automatically (to whatever extent it cares at all). That's the whole point, isn't it? I don't need to know how the standard says a const char* must be represented; I can just use it and it works.


I mean sure, WG21 does those people a great service: in most cases they can say "Oh, that's IFNDR" [Ill-Formed; No Diagnostic Required] or "That's UB", so it's not the compiler's fault your program doesn't work, and they can close your ticket as NOTABUG if they want.

My guess is that all or almost all non-trivial C++ projects trigger IFNDR and thus have no actual meaning per the ISO standard's language. They do work, more or less, but that's just convention and can be taken away at any time, for any reason. Header files, use of concepts, unfortunate algorithmic goofs, careless naming - there are a billion† ways to trigger IFNDR in C++, and by definition that's fatal to your correctness.

† A billion is likely an overestimate, but WG21 keeps no formal inventory of IFNDR in the sprawling ISO document, so who knows.
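To make that concrete, here's the classic case (my sketch, not from the parent): an ODR violation across translation units. Two inline functions share a name but have different bodies; the standard says this is ill-formed, no diagnostic required, and in practice neither the compiler nor the linker will tell you.

    // a.cpp
    inline int answer() { return 42; }
    int a() { return answer(); }

    // b.cpp - same name and signature, different body: an ODR violation
    inline int answer() { return 43; }
    int b() { return answer(); }

    // Linking these produces a program with no defined meaning. Typically
    // the linker silently keeps one definition, so a() and b() both return
    // whichever answer() happened to survive.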


It would be nice if everyone's compiler told its users what crazy thing they did wrong; it's just not required in order to implement C++ at all. That seems reasonable to me. Enhancing the failure-mode UX for an abstract machine interface is not really the same thing as implementing that interface. Maybe it would be nicer if the standard didn't have any possible UB in it.


Oh, it's not that it doesn't tell you what you did wrong - it has no idea anything is wrong. The C++ language is deliberately designed on the assumption that you never make such mistakes.

Detecting this in general isn't difficult, it's impossible, by Rice's Theorem.

It's slightly alarming that anybody thought "Yeah the programmers will always do this correctly" was good enough, but see the rest of our society.


UB is an example of IFNDR right?

Are you saying it's impossible for the compiler to identify and warn on UB? That doesn't sound right.


Oh! No, IFNDR is much worse. Categorically so.

Undefined Behaviour is a runtime thing. For example, suppose I have a function which takes an integer between 0 and 9 inclusive and indexes into a size-10 array, but I discover that due to a bug, if a user hits the "Expedite Gyro" button then as well as expediting the gyro, it calls my function with a value which can be up to 26. Indexing 26 into a size-10 array is Undefined Behaviour: absolutely anything (possible) might happen. But this is a runtime problem; the program still behaves correctly unless you actually hit the button.
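In code, that example looks something like this (names invented for illustration):

    // Contract: idx must be in 0..9 - but nothing enforces it.
    int lookup(int idx) {
        static const int table[10] = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9};
        return table[idx]; // idx == 26 reads past the end: Undefined Behaviour
    }

    // lookup(3) is fine. The UB only exists on the execution where the
    // "Expedite Gyro" handler actually calls lookup(26).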

Undefined Behaviour is more common in a language like C++, but it exists in many safer languages; for example, Go has UB for complex data races. It's hard to design this out of your language, but some languages do (the safe Rust subset and the WUFFS language, for example). It would be reasonable to argue that such languages "identify and warn" on UB, in the sense that programs with UB are rejected.

We can also detect at runtime that UB is about to occur, and, say, report it or stop the software if that seems appropriate. To do this properly we need to know all possible causes of UB and detect them in the moment before they happen, while our behaviour is still well defined. Of course, if we're going to all that effort, maybe we should just reject programs with Undefined Behaviour up front?
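A hand-rolled sketch of that runtime detection for the array example above (sanitizers like Clang/GCC's -fsanitize=undefined and -fsanitize=address automate a lot of this):

    #include <cstdio>
    #include <cstdlib>

    int lookup_checked(int idx) {
        static const int table[10] = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9};
        // Catch the would-be UB while our behaviour is still well defined.
        if (idx < 0 || idx >= 10) {
            std::fprintf(stderr, "lookup_checked: index %d out of range\n", idx);
            std::abort();
        }
        return table[idx];
    }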

Ill-Formed; No Diagnostic Required isn't like that. IFNDR is about the process of translating the program, typically during compilation and thus long before runtime. It's a way to dodge Rice's Theorem: Henry Rice got a Maths PhD 70+ years ago for showing that the question of whether a program satisfies non-trivial semantic (as opposed to syntactic) requirements is Undecidable.

IFNDR says that's OK, we just won't ask: if our program doesn't satisfy the semantic requirements, it had no meaning whatsoever, so we don't care what the translation does. Garbage In, Garbage Out. This allows us to accept all correct programs - we never checked that they're correct, but in principle they work. Unfortunately, the price is that we have no idea whether all, some, or any of our programs are correct.
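C++20 concepts are a neat illustration of the "we just won't ask" half (my example, not the parent's): the compiler checks only a concept's syntactic requirements, while the standard's semantic requirements are on the honour system - and a program whose meaning depends on a concept that is satisfied but not modeled is ill-formed, no diagnostic required ([structure.requirements]).

    #include <concepts>
    #include <cstdlib>

    struct Wobbly {
        // Syntactically this satisfies std::equality_comparable, so it compiles.
        bool operator==(const Wobbly&) const { return std::rand() % 2 == 1; }
    };

    // Passes: the compiler only ever checks the syntactic half.
    static_assert(std::equality_comparable<Wobbly>);

    // But == here is not an equivalence relation, so Wobbly does not *model*
    // the concept. Hand a Wobbly to any library component whose meaning
    // depends on that concept and the program is ill-formed, no diagnostic
    // required: it compiles cleanly and means nothing.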

The other way to dodge Rice's Theorem is the one Rust practises: check the semantic requirements for each program, and if you can't see why a program satisfies them, reject it. The price is that some correct programs won't compile at all. For example, before "non-lexical lifetimes" some really easy, obviously correct Rust didn't compile, because the borrow checker used to assume every borrow lives until the end of its lexical scope.

I believe that incentive structure makes IFNDR toxic, because it encourages the language to accumulate more and more unchecked semantic requirements, since they're "free". Their consequence is that your program is now nonsense and might do anything, but there are no errors and no sign anything is wrong, so you have no reason to complain. In contrast, the incentives for Rust are to improve the semantic checking, because programmers are annoyed when they can see why their program is correct but the compiler is too dumb to see it.



