uecker 5 days ago
It is also not what the C community has chosen. It is what was imposed on us by certain optimizing compilers that used the interpretation giving them maximum freedom to excel in benchmarks, and it was then endorsed by C++. The C definition is that "undefined behavior" can have arbitrary concrete behavior, not that a compiler can assume it does not happen. (That formal-semantics people prefer the latter because it makes their lives easier did not help.)
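To make the kind of transformation uecker is describing concrete, here is a hedged C sketch (the function name and shape are mine, not from the thread) of what the "assume it does not happen" reading allows:

    #include <stddef.h>

    /* Hedged sketch. Under the "assume UB does not happen" reading, the
       dereference on the first line lets the compiler conclude that p is
       non-NULL, so it may delete the later check. Under a "the dereference
       has some arbitrary concrete behavior" reading, one might instead
       expect a trap or a garbage value, with the check still performed. */
    int read_checked(int *p) {
        int v = *p;        /* UB if p == NULL */
        if (p == NULL)     /* optimizing compilers commonly drop this branch */
            return -1;
        return v;
    }

Compiling with optimizations and inspecting the output (e.g. gcc -O2 -S) typically shows the NULL check removed.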
ralfj 3 days ago
> The C definition is that "undefined behavior" can have arbitrary concrete behavior, not that a compiler can assume it does not happen.

What is the difference between those? How does a compiler that assumes UB never happens violate the requirement that UB can have arbitrary concrete behavior? If we look at a simple example like optimizing "x + y > x" (signed arithmetic, y known to be positive) to "true" -- that will lead to some arbitrary concrete behavior of the program, so it seems covered by the definition.

I assume that what the original C authors meant was closer to "on signed integer overflow, non-deterministically pick some result from the following set", but that's not what they wrote in the standard... if you want to specify that something is non-deterministic, you need to spell out exactly what the set of possible choices is. Maybe for signed integer overflow one could infer this (though it really should be made explicit IMO), but C also says that the program has UB "by default" if it runs into a case not described by the standard, and there's just no way to infer a set of choices from that as far as I can see.
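A self-contained version of the "x + y > x" example (the surrounding program and constants are mine; only the expression comes from the comment):

    #include <limits.h>
    #include <stdio.h>

    /* UB if x + y overflows; since signed overflow is undefined, a
       compiler may rewrite the comparison as "y > 0" and fold it to 1
       whenever y is known to be positive. */
    static int grows(int x, int y) {
        return x + y > x;
    }

    int main(void) {
        /* A two's-complement wraparound would give INT_MIN > INT_MAX,
           i.e. 0; optimized builds commonly print 1 instead. */
        printf("%d\n", grows(INT_MAX, 1));
        return 0;
    }

Built without optimizations this typically prints 0 (wraparound), while at -O2 it typically prints 1; both outcomes are consistent with the program having undefined behavior.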