pron 12 hours ago
> That's true for signed numbers too though? `int_min - 2 > int_min`

As someone else already pointed out, that's undefined behaviour in C and C++ (in Java it wraps). But the more important point is that the vast majority of integers used in programs are much closer to zero than to INT_MIN/INT_MAX; sizes of buffers etc. tend to be particularly small. There are, of course, overflow problems with signed integers too, but they're not nearly as common.