> It's a branding problem. They should probably be viewed as different flavors.
They are already different language versions. They're specified in entirely different standards. I don't see what's left to be confused about. At most, perhaps the C++ standard committee could be criticized for repeatedly going out of their way to maximize backward compatibility.
> If every time they're going to add things, remove things, and break things, then in practice we're talking about different strands.
They are already different standards. What's there to miss?
> I know you can do that at the linker and with makefiles and compile flags, this is about a more sane presentation.
This take doesn't make sense. The C++ version being used in a project is a property of the project, not of the translation unit or individual files. A project comprises multiple declarations and corresponding definitions, which are spread around and reused and make sense as a whole. It would not make sense to, say, have a translation unit made up of X C++11 definitions mixed with Y C++20 definitions.
Historically, after you create object files the linker doesn't care what C++ standard the source was, so you could carefully combine different standards. I should establish that I'm talking about the GNU toolchain here, and that it's been a few years since I've done this. I'll try it again when I get home; maybe that all blows up now.
Now, about the other parts: I totally agree with you. However, we're dealing with humans, and if they see incrementing numbers then words like "upgrade", "deprecated", and "unsupported", maybe even "inefficient", get bandied about just because we're using numbers.
We have to go back to the core lesson of Perl 6: it shouldn't have been called Perl 6, because the name suggests a hierarchical comparison and relationship that isn't an accurate depiction of reality.
I wish everyone were sincere and competent, but there's a natural tendency to act based on the context that a structure affords.
Churchill stated it as "we shape our buildings; thereafter they shape us".
So if the standards are better understood as siblings of each other then we need to brand them accordingly and not through a structure that suggests hierarchy, quantity of features, and degrees of relevance.
> Historically after you create object files the linker doesn't care what C++ standard the source was.
Mechanically this is true, but just because we can link object files together doesn't mean the resulting program makes sense.
Suppose I have an object file built with GCC's copy-on-write C++98 strings, and I link it against an object file built with GCC's modern C++11 short-string-optimised strings. If those two objects think they're talking about the same string type, the resulting executable is nonsense and will probably crash or misbehave badly.
It might be helpful to think of WG21's attitude to compatibility as occupying a sort of "strategic ambiguity" akin to the US stance on Taiwan. For example when it comes to ABI changes, the committee voted that they shouldn't happen... yet. Not that they will happen within some horizon, but nor that they won't happen.
Maybe you have a deeper understanding of the GNU toolchain than I do, but I'm able to link together languages with far more dramatic divergences than that.
For instance, Go and C: https://go.dev/doc/install/gccgo (it's pretty far down; let me quote: "The name of Go functions accessed from C is subject to change. At present the name of a Go function that does not have a receiver is prefix.package.Functionname. The prefix is set by the -fgo-prefix option used when the package is compiled; if the option is not used, the default is go. To call the function from C you must set the name using a GCC extension.")
I just had a small C program call fmt.Println from a Go library at my console to confirm: declare the function extern in C to match Go's naming, compile both to objects, then ld with the appropriate libs.
Of course you can break the friendship they can have; this is programming, so that's easy to do.
This can be demonstrated with different C++ versions as well. The C/Go example was meant to show how extremely different languages can interact when you're careful enough in your build process.
Not only is it possible, it is routinely done. At $WORK we have libraries that use C++20 features in their implementation and only expose a C++14 interface, because that's what our clients expect.
It is mildly painful, but not more painful than restricting ourselves to C++14.
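To make it concrete, a minimal single-file sketch of the pattern (hypothetical function name; the `#if` stands in for the header/implementation split, since in reality the implementation file is the only one compiled as C++20):

```cpp
#include <vector>

// --- the C++14-visible interface (what would live in the header) ---
int sum_positive(const std::vector<int>& v);

// --- the implementation (what the library compiles with -std=c++20) ---
#if __cplusplus >= 202002L
#include <ranges>
int sum_positive(const std::vector<int>& v) {
    // C++20 ranges are an implementation detail; nothing in the
    // declaration above exposes them to clients.
    int s = 0;
    for (int x : v | std::views::filter([](int x) { return x > 0; })) s += x;
    return s;
}
#else
// What the same TU looks like if we're restricted to C++14.
int sum_positive(const std::vector<int>& v) {
    int s = 0;
    for (int x : v) if (x > 0) s += x;
    return s;
}
#endif
```

Clients only ever include the declaration, which is plain C++14, so the C++20 machinery stays out of their translation units.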
> At $WORK we have libraries that use C++20 features (...)
That's specified at the package/project level. You're inadvertently proving my point.
> (...) and only expose a C++14 interface because that's what our clients expect.
That's also configured at the package level, because the interface headers need to be included in translation units from projects configured to be C++14.
Again, you're also inadvertently supporting the point I made.
If every time they're going to add things, remove things, and break things, then in practice we're talking about different strands.
Imagine some preprocessor where you can mix them, like
#flavor(ginger)
Instead of, say, c++11, and then proceed with whatever flavor is necessary.
I know you can do that at the linker and with makefiles and compile flags, this is about a more sane presentation.