withzombies 5 hours ago

Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

It's the type of dogfooding they should be doing! It's one reason why people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.

cogman10 5 hours ago | parent | next [-]

There's a bootstrapping process that has to happen to compile the compiler. Moving up the language standard chain requires that the compilers compiling the compiler also move up the chain.

So you can never be perfectly bleeding edge as it'd keep you from being able to build your compiler with an older compiler that doesn't support those bleeding edge features.

Imagine, for example, that you are Debian and you want to prep for the next stable version. It's reasonable that for the next release you'd bootstrap with the prior release's toolset. That allows you to have a stable starting point.

stabbles 5 hours ago | parent | next [-]

This is not the case. They are discussing the default value of `g++ -std=...`. That does not complicate bootstrapping as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.
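
If you want to see what that default actually is on your machine, here's a quick sketch (the file name is made up): compile with no -std flag at all and print the __cplusplus macro.

        // g++ check_default.cpp && ./a.out
        // On current GCC releases this prints 201703 (C++17 default);
        // it would print 202002 once the default moves to C++20.
        #include <iostream>

        int main() {
            std::cout << __cplusplus << '\n';
            return 0;
        }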

cogman10 4 hours ago | parent [-]

> as long as the C++ sources of GCC are compatible with older and newer versions of the C++ standard.

I've worked on a number of pretty large projects. If the target for the source code changes, it can be really hard to keep C++20 features from creeping in. It means that either you need to explicitly build targeting 11, or whoever does code reviews needs to have encyclopedic knowledge of whether or not a change sneaked in a feature from a later standard.

It is "doable" but why would you do it when you can simply keep the compiler targeting 11 and let it do the code review for you.

bluGill 3 hours ago | parent | next [-]

Compilers often allow things in 11 that technically are not there until some later standard. Or sometimes things they have always allowed finally got standardized in a later version. Setting your standard to 11, if that is what you want to target, is a good first step, but don't depend on it - the real test is whether all the compilers you care to support compile your code.
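
For example (my sketch - exact behavior varies with GCC version and warning flags): binary literals are officially a C++14 feature, but g++ has long accepted them in C++11 mode as an extension, so -std=c++11 alone won't catch them.

        int main() {
            // 0b1010 is standard only since C++14, yet `g++ -std=c++11`
            // typically accepts it as a GCC extension; you only get a
            // diagnostic with -pedantic (or an error with -pedantic-errors).
            // Another compiler in strict C++11 mode may simply reject it.
            int mask = 0b1010;
            return mask - 10;
        }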

Even if you only target 11, there may be advantages to setting a newer version anyway. Sometimes the standard finally allows some optimization that would work, or disallows something that was always error-prone anyway. I would recommend you set your standard to the latest the compiler supports and fix any bugs. Solve your "we have to support older standards" problem by having your CI system build with an older compiler (and also the newest one). C++ is very good at compatibility, so this will rarely be a problem.

quietbritishjim 4 hours ago | parent | prev [-]

> ... why would you do it when you can simply keep the compiler targeting 11 ...

It doesn't appear to me that the parent comment was implying otherwise.

The default is changing for any compilation that doesn't explicitly specify a standard version. I would have thought that the build process for a compiler is likely careful enough that it does explicitly specify a version.

cogman10 4 hours ago | parent [-]

> It's the type of dogfooding they should be doing! It's one reason why people care so much about self-hosted compilers: it's a demonstration of the maturity of the language/compiler.

I could be misreading this, but unless they have a different understanding of what it means to dogfood than I do, it seems like the proposal is to use C++20 features in the compiler bootstrapping.

ziotom78 4 hours ago | parent [-]

I believe they are really referring to the default mode used by GCC when no standard is explicitly stated.

The email mentions that the last time they changed it was 5 years ago in GCC 11, and the link <https://gcc.gnu.org/projects/cxx-status.html#cxx17> indeed says

> C++17 mode is the default since GCC 11; it can be explicitly selected with the -std=c++17 command-line flag, or -std=gnu++17 to enable GNU extensions as well.

which does not imply a change in an obscure feature (bootstrapping) that would only affect a few users.

rmu09 5 hours ago | parent | prev | next [-]

Aren't they talking about the C++ dialect the compiler expects without any further -std=... arguments? How does that affect the bootstrapping process? This https://gcc.gnu.org/codingconventions.html should define what C/C++ standard is acceptable in GCC itself.

maxlybbert 4 hours ago | parent | next [-]

Correct, this is a discussion of which language version the compiler should follow if the programmer doesn’t specify one. It’s not about which features are acceptable when implementing the compiler.

cogman10 5 hours ago | parent | prev [-]

The way I read withzombies's comment (and it could be wrong) was that they were talking about the language version of the compiler's source. I assumed that from the "dogfooding" portion of the comment.

kstrauser 4 hours ago | parent | prev [-]

Counterpoint: you could write a C++ compiler in a non-C/C++ language such that the compiler’s implementation language doesn’t even have the notion of C++20.

A compiler is perfectly capable of compiling programs which use features that its own source does not.

cxr 4 hours ago | parent [-]

That's not a counterpoint—at least not to anything in the comment that you're (nominally) "responding" to.

So why has it been posted as a reply, and why label it a counterpoint?

kstrauser 3 hours ago | parent [-]

Read them again a couple more times and it may become clear.

The prior post seemed to be claiming that this required some form of bootstrapping process, when it does not.

cxr 3 hours ago | parent | next [-]

You have lost the plot, and you are wrong.

wat10000 2 hours ago | parent | prev [-]

This particular compiler does require bootstrapping, and that's obviously what "the compiler" is referring to in that comment.

Building your compiler in another language doesn't help at all. In fact, it just makes it worse. Dogfooding C++20 in your compiler that isn't even built in C++ is obviously impossible.

kstrauser 2 hours ago | parent [-]

It absolutely does not. There is no part of C++20 that requires the implementing compiler to be written in C++20.

My original point is that you can write a compiler for any language in any language.

cxr 32 minutes ago | parent | next [-]

> My original point is that you can write a compiler for any language in any language.

A perfectly fine observation on its own—but it's not on its own. It's situated in a conversational context. And the observation is in no way a counterpoint to the person you posted your ostensible reply to.

Aside from that, you keep saying "bootstrapping", as in whether or not this or that compiler implementation strategy "requires bootstrapping". But writing a compiler in a different source language than the target language it's intended to compile, and using that to build the final compiler, doesn't eliminate bootstrapping. The compiler in that other language is just part of the bootstrapping process.

wat10000 2 hours ago | parent | prev [-]

What is "It absolutely does not" responding to? I didn't say anything about a C++20 compiler needing to be written in C++20.

kstrauser 40 minutes ago | parent [-]

You said:

> This particular compiler does require bootstrapping, and that's obviously what "the compiler" is referring to in that comment.

You have to pick an option: either it requires bootstrapping, or it doesn’t.

As it’s possible to write the C++20 compiler features in C++11 (or whatever GCC or Clang are written in these days), it factually does not require bootstrapping.

unclad5968 5 hours ago | parent | prev | next [-]

Well, there are still some C++20 items that aren't fully supported, at least according to cppreference.

https://en.cppreference.com/w/cpp/compiler_support/20.html

withzombies 2 hours ago | parent [-]

Yeah, I think it's because none of the compilers are obligated to support the standard and things get added that never get implemented.

A good example is the C++11 standard garbage collection! It was explicitly optional, but AFAIK no one implemented it.

https://isocpp.org/wiki/faq/cpp11-library#gc-abi

dagmx 4 hours ago | parent | prev | next [-]

This is about changing the default.

The issue with defaults is that people have projects that implicitly expect the default to be static.

So when the default changes, many projects break. This is maybe fine if it’s your own project but when it’s a few dependencies deep, it becomes more of an issue to fix.

reactordev 4 hours ago | parent | next [-]

If you’re relying on defaults, and upgrade, that is entirely your fault. Don’t hold everyone in the world back because you didn’t want to codify your expectations.

dietr1ch 3 hours ago | parent | prev | next [-]

So it has the added benefit of having people learn how to set up their projects properly? Great.

bluGill 3 hours ago | parent | prev | next [-]

C++ is very good at compatibility. If your code breaks when the standard changes, odds are it was always broken and you just didn't know. C++ isn't perfect, but it is very good.
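
A classic example of "it was always broken" (my sketch, drawn from an older standard bump rather than 17 to 20):

        #include <cstdio>

        int main() {
            // Binding a string literal to a non-const char* was deprecated
            // back in C++98 and finally made ill-formed in C++11. Code that
            // "broke" when the default moved forward was really relying on a
            // conversion that was never supposed to be used:
            // char *greeting = "hello";   // error under C++11 and later

            const char *greeting = "hello";  // what the code always meant
            std::puts(greeting);
            return 0;
        }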

wat10000 2 hours ago | parent [-]

On the other hand, if you didn't know your code was broken then it probably wasn't broken in a way that's catastrophic to whatever you use it for.

MichaelZuo 4 hours ago | parent | prev [-]

That sounds more like a problem of nonsensical assumptions… what possible expectation could there have been that GCC would never change this in the future?

jayd16 4 hours ago | parent [-]

The assumption is along the lines of "this works so why should I ever think about it again if I don't have to?"

It's not an end user problem, anyway. The issue is the language didn't change in a backwards compatible way and also didn't require setting a language version.

andsoitis 5 hours ago | parent | prev | next [-]

> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

C++ standards support and why C++23 and C++26 are not the default: https://gcc.gnu.org/projects/cxx-status.html

BeetleB 3 hours ago | parent | prev | next [-]

> What is the downside of switching to the newest standard when it's properly supported?

"Properly supported" is the key here. Does GCC currently properly support C++23, for example? When I checked a few months ago, it didn't.

jcelerier 3 hours ago | parent [-]

Where do you draw the line for "properly supported"? I've been using g++ in C++23 mode for quite some time now - even if not every feature is fully implemented, the ones that work, work well and are a huge improvement.

tlb 2 hours ago | parent [-]

I draw the line where I can't expect the default gcc on most Linux and Mac systems to compile my code. And I don't want to force them to install a particular compiler. -std=c++20 seems to work pretty reliably these days.

We're starting to need caniuse.com for C++.
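
The closest thing we have today is feature-test macros - not a website, but at least code can probe for a feature and degrade gracefully. A rough sketch (the add() example is made up):

        // Language feature macros like __cpp_concepts are predefined by the
        // compiler; <version> (itself C++20) collects the library ones.
        #if defined(__has_include)
        #  if __has_include(<version>)
        #    include <version>
        #  endif
        #endif

        #if defined(__cpp_concepts) && __cpp_concepts >= 201907L
        // C++20 concepts are available: constrain the template.
        template <typename T>
        concept Addable = requires(T a, T b) { a + b; };

        template <Addable T>
        T add(T a, T b) { return a + b; }
        #else
        // Older compiler/standard: fall back to an unconstrained template.
        template <typename T>
        T add(T a, T b) { return a + b; }
        #endif

        int main() { return add(1, 2) - 3; }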

1718627440 5 hours ago | parent | prev | next [-]

> What is the downside of switching to the newest standard when it's properly supported?

They are discussing in this email thread whether it is already properly supported.

> It's one reason why people care so much about self-hosted compilers

For self-hosting and bootstrapping, you want the compiler to be compilable with as old a version as possible.

binary132 4 hours ago | parent | prev | next [-]

A lot of software, and thus build automation, will break due to certain features that become warnings or outright errors in new versions of C++. It may or may not be a lot of work to change that, and it may or may not even be possible in some cases. We would all like there to be unlimited developer time, but in real life software needs a maintainer.

withzombies 2 hours ago | parent [-]

I'm not talking about software compiled by the compiler having a higher default.

Warnings becoming errors would be scoped to GCC itself, and they can fix them as part of the upgrade.

hulitu 2 hours ago | parent | prev | next [-]

> Shouldn't the compilers be on the bleeding edge of the standards? What is the downside of switching to the newest standard when it's properly supported?

Cursing because the old program does not compile anymore. So no.

withzombies 2 hours ago | parent [-]

No old programs would become unable to compile with the proposed change.

superkuh 5 hours ago | parent | prev | next [-]

When a language changes significantly faster than release cycles (i.e., Rust being a different compiler every 3 months), it means that distros cannot self-host if they use Rust code in their software. For example, with Debian's Apt now having Rust code, and Debian's release cycle being 4 years for LTS, Debian's shipped rustc won't be able to compile Apt, since nearly all Rust devs are bleeding-edge targeters. The entire language culture is built around this rapid improvement.

I love that C++ has a long enough time between changing targets to actually be useful and that its culture is about stability and usefulness for users trying to compile things rather than just dev-side improvements uber alles.

surajrmal 4 hours ago | parent | next [-]

The problem you mention is perhaps a sign that the model Debian uses is ill-suited for development. Stable software is great, but it need not impede progress and evolution. It's also possible to support older Rust compiler versions if it's important - apt developers can do the work necessary to support 4-year-old LTS compilers.

mustache_kimono 4 hours ago | parent | prev [-]

> Debian's shipped rustc won't be able to compile Apt since nearly all rust devs are bleeding edge targeters.

This is nonsense. Apt devs can target a rustc release, and that release can be the same release that ships with Debian. Moreover, since those apt devs may have some say in the matter, they can choose to update the compiler in Debian!

> The entire language culture is built around this rapid improvement.

... Because this is a cultural argument about how some people really enjoy having their codebase be 6 years behind the latest language standard, not about any actual practical problem.

And I can understand how someone may not be eager to learn C++20's concepts or to add them immediately to a code base, but upgrades to your minimum Rust version don't really feel like that. It's much more like: "Wow, that's a nifty feature I immediately understand and would like to use from the std lib. That's a great alternative to [much more complex thing...]" See, for example, OnceLock, added in 1.70.0: https://doc.rust-lang.org/std/sync/struct.OnceLock.html

ajross 5 hours ago | parent | prev [-]

> What is the downside of switching to the newest standard when it's properly supported?

Backwards compatibility. Not all legal old syntax is necessarily legal new syntax[1], so there is the possibility that perfectly valid C++11 code exists in the wild that won't build with a new gcc.

[1] The big one is obviously new keywords[2]. In older C++, it's legal to have a variable named "requires" or "consteval", and now it's not. Obviously these aren't huge problems, but compatibility is important for legacy code, and there is a lot of legacy C++.

[2] Something where C++ and C standards writers have diverged in philosophy. C++ makes breaking changes all the time, where C really doesn't (new keywords are added in an underscored namespace and you have to use new headers to expose them with the official syntax). You can build a 1978 K&R program with "cc" at the command line of a freshly installed Debian Unstable in 2025 and it works[3], which is pretty amazing.

[3] Well, as long as it worked on a VAX. PDP-11 code is obviously likely to break due to word size issues.
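
To make [1] concrete, a sketch (the names are invented): the snippet below is perfectly legal C++17, but it stops compiling the moment the standard becomes C++20, because "requires" and "consteval" turned into keywords.

        //   g++ -std=c++17 -c keywords.cpp   // fine
        //   g++ -std=c++20 -c keywords.cpp   // errors on both identifiers

        struct package {
            int requires;  // a plain member name in C++17, a keyword in C++20
        };

        int bump(int consteval) {  // parameter name, likewise now a keyword
            return consteval + 1;
        }

        int main() {
            package p{3};
            return bump(p.requires) - 4;  // 0
        }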

tcfhgj 3 hours ago | parent [-]

Well, shouldn't the not-up-to-date code be the one to use the corresponding compiler flag, rather than someone starting a greenfield project, who might then end up writing outdated code?

ajross 3 hours ago | parent [-]

No? The "corresponding compiler flag" is a new feature. I mean, who told folks at Bell Labs in 1978 how the GCC --std= arguments would work in the coming decades? Legacy code is legacy, it doesn't know it needs to use the correct flags. When it was a greenfield project, it was the default!

Like, think about it: if you think the defaults should be good for greenfield projects, then greenfield projects won't be using the correct flags (because if they are, then the whole argument is specious anyway). And when C++34 shows up, they're going to be broken and we'll have this argument again.

Compatibility is hard. But IMHO C++ and gcc are doing this wrong and C is doing it much better.

plorkyeran 20 minutes ago | parent [-]

GCC's default has already changed before (most recently to C++17 in GCC 11). It did not cause any significant problems, and any software relying on the current value was created long after the flags to pick a standard version were added.