ralfj 3 days ago
> The C definition is that "undefined behavior" can have arbitrary concrete behavior, not that a compiler can assume it does not happen.

What is the difference between those? How does a compiler that assumes UB never happens violate the requirement that UB can have arbitrary concrete behavior? If we look at a simple example like optimizing "x + y > x" (signed arithmetic, y known to be positive) to "true" -- that will lead to some arbitrary concrete behavior of the program, so it seems covered by the definition.

I assume that what the original C authors meant was closer to "on signed integer overflow, non-deterministically pick some result from the following set", but that's not what they wrote in the standard... if you want to specify that something is non-deterministic, you need to spell out exactly what the set of possible choices is. Maybe for signed integer overflow one could infer this (though it really should be made explicit IMO), but C also says that the program has UB "by default" if it runs into a case not described by the standard, and there's just no way to infer a set of choices from that as far as I can see.
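To make the "x + y > x" example concrete (a minimal sketch; the function name `exceeds` is my invention, and the exact output depends on compiler and flags): because signed overflow is UB, GCC and Clang at -O2 typically rewrite the comparison to "y > 0".

    #include <limits.h>
    #include <stdio.h>

    /* Because signed overflow is UB, a compiler may assume x + y does
     * not wrap, and may therefore rewrite "x + y > x" as "y > 0". */
    int exceeds(int x, int y) {
        return x + y > x;
    }

    int main(void) {
        /* With that rewrite, this prints 1, even though the wrapped
         * two's-complement result of INT_MAX + 1 would be negative. */
        printf("%d\n", exceeds(INT_MAX, 1));
        return 0;
    }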
uecker 3 days ago
"arbitrary concrete behavior" means that at this point anything can happen on the real machine. This implies that everything before this point has to behave according to the specification. "is impossible" is stronger, as the whole program could behave erratically. But having partial correctness is important in a lot of scenarios and this is why we want to have it and in "UB" it is the former and not "impossible". In the ISO C standard, we use "unspecified" for a non-deterministic choice among clearly specified alternatives. So this is well understood. | ||||||||