| ▲ | pron 3 hours ago |
| > I take a 'living' language any day over a language that's ossified because of strict backward compatibility requirements Maybe you would, but >95% of serious projects wouldn't. The typical lifetime of a codebase intended for a lasting application is over 15 or 20 years (in industrial control or aerospace, where low-level languages are commonly used, codebases typically last for over 30 years), and while such changes are manageable early on, they become less so over time. You say "strict" as if it were out of some kind of stubborn principle, when in fact backward compatibility is one of the things people who write "serious" software want most. Backward compatibility is so popular that at some point it's hard to find any feature that is in high-enough demand to justify breaking it. Even in established languages there's always a group of people who want something badly enough that they don't mind breaking compatibility for it, but they're almost always a rather small minority. Furthermore, a good record of preserving compatibility in the past makes a language more attractive even for greenfield projects written by people who care about backward compatibility, who, in "serious" software, make up the majority. When you pick a language for such a project, the expectation of how the language will evolve over the next 20 years is a major concern on day one (a startup might not care, but most such software is not written by startups). |
|
| ▲ | flohofwoe 3 hours ago | parent | next [-] |
| > The typical lifetime of a codebase intended for a lasting application is over 15 or 20 years (in industrial control or aerospace). Either those applications are actively maintained, or they aren't. Part of the active maintenance is to decide whether to upgrade to a new compiler toolchain version (when in doubt, "never change a running system"); old compiler toolchains won't suddenly stop working. FWIW, trying to build a 20 or 30 year old C or C++ application with a modern compiler also isn't exactly trivial, depending on the complexity of the code base (especially when there's UB lurking in the code, or the code depends on specific compiler bugs being present); changing anything in a project setup always comes with risks attached. |
| |
| ▲ | pron 2 hours ago | parent | next [-] | | > Part of the active maintenance is to decide whether to upgrade to a new compiler toolchain version Of course, but you want to make that as easy as you can. Compatibility is never binary (which is why I hate semantic versioning), but you should strive for the greatest compatibility for the greatest portion of users. > FWIW, trying to build a 20 or 30 year old C or C++ application in a modern compiler also isn't exactly trivial I know that well (especially for C++; in C the situation is somewhat different), and the backward compatibility of C++ compilers leaves much to be desired. | |
| ▲ | strawhatguy an hour ago | parent | prev [-] | | You could fix versions, and probably should. However, willful disregard of prior interfaces encourages developers' code to follow suit. It's not like Clojure or Common Lisp, where decades-old software still runs, mostly unmodified, the same today; any changes are mainly due to code written for a different environment or even compiler implementation.
This is largely because they take breaking user code far more seriously. A lot of code written in these languages seems to have a similar timelessness. Software can be "done". |
|
|
| ▲ | andrepd 2 hours ago | parent | prev [-] |
| I would also add that Rust manages this very well. Editions let you do breaking changes without actually breaking any code, since any package (crate) needs to specify the edition it uses. So when in 30 years you're writing code in Rust 2055, you can still import a crate that hasn't been updated since 2015 :) |
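The edition mechanism described above is just a per-crate setting in the manifest; a minimal sketch of what that looks like (crate name and version are made up):

```toml
# Each crate declares its own edition in Cargo.toml. The compiler applies
# that edition's rules only to this crate, so it can be compiled into the
# same build as crates on newer editions.
[package]
name = "old-crate"
version = "0.1.0"
edition = "2015"
```

Because the edition is resolved per crate rather than per build, a 2015-edition dependency and a 2024-edition consumer never have to agree on surface syntax, only on the compiled interface.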
| |
| ▲ | zozbot234 2 hours ago | parent [-] | | Unfortunately editions don't allow breaking changes in the standard library, because Rust code written in different editions must be allowed to interoperate freely even within a single build. The resulting constraint is roughly similar to that of never ever breaking ABI in C++. | | |
| ▲ | kibwen an hour ago | parent [-] | | > The resulting constraint is roughly similar to that of never ever breaking ABI in C++. No, not even remotely. ABI-stability in C++ means that C++ is stuck with suboptimal implementations of stdlib functions, whereas Rust only stabilizes the exposed interface without stabilizing implementation details. > Unfortunately editions don't allow breaking changes in the standard library Surprisingly, this isn't true in practice either. The only thing that Rust needs to guarantee here is that once a specific symbol is exported from the stdlib, that symbol needs to be exported forever. But this still gives an immense amount of flexibility. For example, a new edition could "remove" a deprecated function by completely disallowing any use of a given symbol, while still allowing code on an older edition to access that symbol. Likewise, it's possible to "swap out" a deprecated item for a new item by atomically moving the deprecated item to a new namespace and making the existing item an alias to that new location, then in the new edition you can change the alias to point to the new item instead while leaving the old item accessible (people are exploring this possibility for making non-poisoning mutexes the default in the next edition). |
|
|