▲ ajross 5 hours ago
> What is the downside of switching to the newest standard when it's properly supported?

Backwards compatibility. Not all legal old syntax is necessarily legal new syntax [1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.

[1] The big one is obviously new keywords [2]. In older C++ it's legal to have a variable named "requires" or "consteval", and now it isn't. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.

[2] This is an area where the C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, whereas C really doesn't: new keywords are added in the underscored reserved namespace, and you have to include new headers to expose them under the official spelling. You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works [3], which is pretty amazing.

[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word-size issues.
▲ tcfhgj 3 hours ago | parent [-]
Well, shouldn't code that hasn't been kept up to date use the corresponding -std= compiler flag, rather than someone starting a greenfield project defaulting to an old standard and perhaps writing outdated code?