Thanks to Herb and all the others who worked so hard on C++23. As someone who has made their living with C++ for 25 years now, I’m very grateful to all the people who keep making the language and my professional life better every 3 years.
I've been using C++ a lot lately after many years away. There are different kinds of frustration. With C++ my frustration is the "this is so close to being awesome" variety, not the "wow, this just sucks" variety.
It feels like the C++ committee should be working on a "C++ Safe Edition 2025" as an opinionated and radical overhaul of the language. Absent something radical, govs & industry leaders are going to continue to call for the death of C++.
Given work on Circle, Carbon, etc. -- a "radically opinionated" Edition of C++ as a fork in the standard seems required (where Edition = Fork).
Incrementalism towards a "more polished" C++ won't be enough; not least due to branding.
Herb Sutter is working on a standalone, second syntax for C++ called "Cpp2" [1] to "reduce complexity 10x, increase safety 50x, improve toolability 10x, evolve more freely for another 30 years". [2]
Compiler Explorer lists it under Cpp2-cppfront. [3]
But isn't such a language already there? It's called the D Programming Language [1]. Sorry, a bit tongue-in-cheek, but I'm getting tired of all these 'even-more-awesome' new programming language variants of C++.
D could have been that. Back when it was released, D was so far ahead of C++98. However, D decided to go with garbage collection which kept it from being a replacement for C++. In addition, the D garbage collector never received the massive investment that Java’s garbage collector has received.
The problem was never the GC per se (see TinyGo, Astrobe, microEJ, Meadows, among others); it was always chasing the next big thing that might bring users, while leaving the previous one behind, not fully done.
Now C++, Java and C# have many capabilities where D had the upper hand back when Andrei's book came out, and naturally new competition with better corporate support also came into play.
Why is this bad? I'm tired of this garbage collection complaint.
If you write extensions for R/Tcl/Lisps for computation, plotting, or abstract mathematics, why is garbage collection bad?
You still get native performance!
My only complaint is that D might consider adding some of the ergonomics of the ML family of languages. But it has its own opinions (like Lisps do) and I respect that.
They had something like this. It was called Core Guidelines.
I think it was a bit of a misnomer b/c it sounded like some glorified style guide. But it was actually a set of tools and helpers/wrappers that effectively made it impossible to write a lot of unsafe pre-modern C++.
In principle it seems to kinda strike the right balance where the language is narrowed: it's still all valid C++ (not an entirely new thing), but not all C++ is valid under the Core Guidelines.
I'll be honest - I never tried it myself so I don't know its deficiencies. Maybe someone who's given it a good try could weigh in.
I reference the cpp core guidelines all the time. It's not dead at all, things like clang-tidy make references to it constantly for rationale behind recommendations.
There are only a few things I'm not a fan of, mostly around mutable out parameters (I prefer pointers over refs to make it clear at the call site that it's a potential mutation).
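A contrived sketch of what I mean at the call site (function names are made up):

    #include <string>

    void append_suffix_ref(std::string& name) { name += ".bak"; }   // out-param by reference
    void append_suffix_ptr(std::string* name) { *name += ".bak"; }  // out-param by pointer

    int main() {
        std::string file = "data";
        append_suffix_ref(file);   // nothing at the call site says 'file' may be mutated
        append_suffix_ptr(&file);  // the '&' flags the potential mutation to the reader
    }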
reference as in a style-guide? Or are you actually using the guidelines support library with things like gsl::index ? I've never seen it used in the wild
I don't use the gsl, no, but I do use linters (like clang-tidy) that enforce a decent chunk of the guide. And reference like a style guide (or any documentation really), yes.
> but I do use linters (like clang-tidy) that enforce a decent chunk of the guide. And reference like a style guide (or any documentation really), yes.
That's my experience as well. Clang-tidy, clang-format, cppcheck and the Core Guidelines. I don't recall the last project I've worked on that didn't use that combo.
I don't believe this is true at all. I don't recall working on a single C++ project in the last half dozen years that hasn't adopted the Core Guidelines up to the code review level.
Of course, it's a bit of survivorship bias to claim that no project uses the Core Guidelines if you personally fail to consider them to start with.
Remarkable. This is the first time I've ever heard of the "Core Guidelines".
Seems like something I would be interested in; I've increasingly felt overwhelmed by the endlessly growing scope of C++, and the challenge of understanding which parts are and are not "Modern C++".
Given the traction safe languages like rust, C#, typescript/javascript, swift, etc. have, I suspect the time for this is long past -- if there ever was a time.
I think the problem is a "safe C++" might look a lot like C++ but will have enough fundamental differences to be like a new language from a practical standpoint... devs will have to learn and use new patterns, new tools. Old code will need to be ported and possibly significantly rewritten, since we're talking about memory and threads. At that point, you might find using one of these other safe languages a better option.
Give me C# with RAII and native AOT compilation and I basically don't need to use C++ again. That's close enough to regular C++ that a lot of programs wouldn't need major redesigns.
using and/or finally are things you have to get right every time you instantiate the class, and if you get it wrong you no longer get deterministic destruction. Compare with C++, where RAII is the default policy and you have to take extra steps to get dynamic lifetimes.
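For comparison, a minimal C++ sketch of RAII as the default policy (the File wrapper here is made up, not a standard class):

    #include <cstdio>

    struct File {
        std::FILE* handle;
        explicit File(const char* path) : handle(std::fopen(path, "r")) {}
        ~File() { if (handle) std::fclose(handle); }   // deterministic cleanup, always
        File(const File&) = delete;
        File& operator=(const File&) = delete;
    };

    void read_config() {
        File f("config.txt");   // no 'using', no 'finally' for the caller to forget
        // ... use f.handle ...
    }                           // closed here on every path, including exceptions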
> I think the problem is a "safe C++" might look a lot like C++ but will have enough fundamental differences to be like a new language from a practical standpoint
C++ already crossed that threshold many years ago. Today's C++ is essentially a different language from C++ of yore. But it maintains compatibility with the old C++.
This is, I think, the central flaw in modern C++. Because of trying to keep that compatibility, so many newer features are bags-on-the-side and the language is bordering on unmanageable.
I think it would have been better to have branched off from C++, given it a new name, and waved goodbye to code compatibility. It would have been a better language for it.
Nah, I disagree. Rust fills the void of "natively compiled safe language". If you need something that's "like C++" (whatever that means to you) but safe, Rust may not work for you. And it's certainly not true that no one needs a safe C++. If someone right now starts a new project in C++ and not Rust it's because that's what they think is best all things considered, not because they don't need memory safety.
> If someone right now starts a new project in C++ and not Rust it's because that's what they think is best all things considered, not because they don't need memory safety.
I guess most reasons are 'use what we know well', and 'use what we have tooling for'. But there's probably some others.
I think you’d be hard pressed to name another case where that kind of breaking change was very successful. I’d highlight the Python 2->3 change as not something to replicate. Same with what has happened to Perl.
There are some arguments to be made they didn't break enough.
For example:
- I think they bungled the Unicode transition by being too dogmatic and it took them a few point releases to include some quality of life improvements for migration that really allowed legacy libraries to transition
- the GIL dependency didn't go away
- typing was added later
Python 3 would have been a huge jump with those changes. Instead, Python 3.0 was very interesting but a major annoyance for limited benefits to 99% of Python app devs.
There's some serious work going on with the GIL these days. And also about immutability and reducing reference counting.
> [...] it took them a few point releases to include some quality of life improvements for migration that really allowed legacy libraries to transition
I wouldn't hold that against the transition too much. No one gets everything right from the start.
About typing: owing to Python's culture, I don't think mainstream Python will be statically typed by default (or even obligatorily) anytime soon.
That would imply that it actually finished, which is not obvious to me. The last time I found Python 2 code must have been a few months ago, and I'm not a regular Python user. And that's kind of the problem: old code never just goes away. It lingers until you try to compile it and it fails.
Old C++ code -- even very old C++ code -- still compiles fine with the latest C++ compilers. Old Python code does not. That's a huge, and critically important, difference.
I don't know why (maybe someone can explain?) but frontend developers seem very happy to frequently move between languages and frameworks. I don't think that's true for the places where C++ is popular.
C++ is (primarily) used where there aren't any other viable alternatives (other than C).
Browser frontends (in general) don't face such demands (size, memory, speed guarantees), and their bloat more often comes from things outside of code (images, videos, trackers), so there's freedom to use whatever floats one's boat.
Combined with the ever-increasing demands on app complexity, frameworks are born left and right. The strong movement toward functional programming in this area has also led the way for other languages (SMLs, Lisps) that better fit that model.
you would be wrong - i used to be a c++ developer working in finance, and my colleagues and i routinely used at least half a dozen other languages and frameworks.
c++, c, java, vb, python, js, various stored proc languages, and bash scripts were all used on the last project i worked on, along with tibco, reuters, com, and corba frameworks. plus certainly stuff i have forgotten.
If you ask me, the GSL [1] alone is a fairly radical departure from C++ that delivers a lot of safety. I don't know if it's gained much popularity, though - probably because it introduces a disruption similar to what you'd get from a brand new programming language.
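For the curious, a small sketch of the kind of annotations the GSL provides (this assumes Microsoft's GSL implementation and its gsl::span / gsl::not_null types):

    #include <gsl/gsl>   // Guidelines Support Library

    int first_or(gsl::span<const int> view, gsl::not_null<const int*> fallback) {
        // span carries its length with the pointer, so there is no separate
        // size argument to get wrong; not_null documents (and checks) that
        // 'fallback' can never be nullptr.
        return view.empty() ? *fallback.get() : view[0];
    }

    int main() {
        const int d = -1;
        const int values[] = {1, 2, 3};
        return first_or(values, &d);   // the span is built from the array, length included
    }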
I feel that TitaniC++ has run into the iceberg of memory safety and complexity.
Bjarne Stroustrup’s previous article, which basically tried to minimize memory safety because it is a subset of program safety, shows that Standard C++ really doesn’t have a great answer.
I feel C++ needs to do the following to be able to survive:
1. Adopt language epochs so that the language syntax and semantics can evolve without having to fight backwards compatibility. For example, the integer promotion rules are very confusing and a source of bugs (see the sketch after this list).
2. Adopt solid dangling reference analysis into the compiler itself (using language epochs above). Static analysis as an external tool or command line flag is not good enough.
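A short sketch of both pain points; every mainstream compiler accepts this today with, at best, a warning (all names here are made up):

    #include <cstdint>
    #include <iostream>
    #include <string>

    // 2. A dangling reference the compiler happily accepts: if both arguments
    //    are temporaries, the returned reference dangles immediately.
    const std::string& shorter(const std::string& x, const std::string& y) {
        return x.size() < y.size() ? x : y;
    }

    int main() {
        // 1. Integer promotion: both operands are unsigned 8-bit, yet they are
        //    promoted to (signed) int before the addition, so this prints 300
        //    rather than the wrapped uint8_t value many people expect.
        std::uint8_t a = 200, b = 100;
        std::cout << a + b << '\n';

        // Mixing signedness is worse: -1 silently converts to a huge unsigned
        // value, so this comparison prints 0 (false).
        std::cout << (-1 < 1u) << '\n';

        // 2. Both temporaries die at the end of the full expression, so 'ref'
        //    dangles; using it below is undefined behaviour.
        const std::string& ref = shorter("hi", "hello");
        std::cout << ref << '\n';
    }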
With Rust, C++ now has a free, popular competitor language that is also focused on creating zero overhead abstractions, but that can guarantee memory safety and which also doesn’t have a bunch of language warts. If C++ doesn’t come up with a good story real soon to deal with memory safety at the language level and deal with the mess of language complexity, it will be relegated to a legacy language.
That's not going to happen. There's just too much friction to reach consensus on how to move the language in that kind of direction. It's like trying to get a centipede to walk to the food when none of the legs can agree on the best path to take. Before that happens, it's much more likely that some other language that interoperates well with C++ will reach a useful state (with regards to IDE support and other tools) and will be embraced by the community. By "community" I don't just mean hobbyists, but the major members of the C++ committee: Microsoft, Google, Nvidia, Intel, etc. That language might be Rust or it might be some other language that's yet to appear. It might not even be a single language, but several. I personally think Rust is not a suitable successor to C++.
One thing that rust doesn't (yet) have is a strong presence and toolbox for usage in scientific computing. In scientific computing, when speed matters, Fortran/C/C++ still very much rule and rust won't take their place anytime soon. Julia has a better chance at that, in fact.
* It's too different. This is to be expected because it was never designed to be like C++. For example, if I have an OOP-heavy application, moving it over to Rust is not a matter of fixing up a few ideas here and there like it might be with, say, Java or C#. It calls for a redesign/rewrite. I don't think a memory safe C++ successor would need to be more like Rust than like C++.
* The language is not substantially simpler than C++. I find traits to be particularly difficult to reason about and to follow, to the point that I find myself just trying to find the right incantation that will appease the compiler. And that complexity shows up at build time just like it does with C++.
> I feel C++ needs to do the following to be able to survive [...]
absolutely. unfortunately, safety oriented proposals tend to focus on a weaker warning-esque approach ([[nodiscard]], contracts, etc.), which, in my opinion, will ultimately only make things worse. i'll stay away from my type-systems-are-great soapbox :)
breaking ABI, removing non-destructive moves, and strengthening concepts to be more akin to their original proposal are not even on the horizon of discussion, despite us having exemplars of all 3.
The refusal to do an ABI break is a crazy move on the C++ committee's part. C++ without an ABI break loses performance, has a bunch of confusing features (eg bucket iterators in hash tables...), and completely prevents a lot of proposals that would make smart pointers work a lot better.
> Adopt language epochs so that the language syntax and semantics can be evolved without having to fight backwards compatibility.
That's what Javascript got surprisingly right IMO: you can target the latest bleeding edge features, and still be able to compile and run them on 20-year-old browsers through code re-writing (aka transpilation) and polyfills [1]
I often wish this was available for other languages, too.
[1] Well, not all features because some if them are notoriously hard to polyfill (IIRC Proxies are one such unpolyfillable feature). And there's the huge issue of the entire world relying on just one guy: https://github.com/zloirock/core-js/blob/master/docs/2023-02...
Linked lists are complicated. You can get something that compiles easily with C++, but to be sure it doesn't leak memory might require you to get that PhD after all.
Linked lists are a very simple data structure, and pretty much everyone can get them right. There haven't been any big memory safety CVEs to my knowledge around linked list manipulation. The same for binary trees.
These pointer-based data structures are used a lot when you are manipulating large structs in C and C++, as they let you "move" them between data structures without copying a ton of memory.
... on the third hand it's better not to roll your own list in any language ;-)
Joking aside, some stuff are currently really hard in Rust, but the rate of change slowed down, and all I can see now is everyday ergonomic improvements, so there is hope for (slightly) simpler syntax in the future.
It's also just generally not a wise structure to be making or using in application-level development at this point. Vectors will outperform them most of the time, are more cache friendly, and are easier to write these days. And Rust has already done the work of building collections you can use, including some that can be used no-std in embedded/baremetal environments.
And if you really need a linked-list type structure, and are competent to make it and use it properly/safely, unsafe {} is waiting for you.
You tend to see linked lists in places like operating systems, storage systems, and stackless asynchronous server systems. These kinds of linked lists actually do perform better than vectors in most cases.
Specifically, linked lists do well when objects are:
1. Large, specifically 100s of bytes
2. Moved frequently between collections
3. Intrusively linked (meaning that the pointers are part of the struct, not a separate library node; see the sketch below) and
4. Never randomly accessed
One size does not fit all when it comes to performance of list structures, unfortunately. In particular, large items tend to break standard library structures, since they are not the usual case.
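To make point 3 concrete, here's a minimal sketch of an intrusive doubly linked list (all names are made up; real implementations, e.g. in kernels, add type-safe accessors on top):

    struct ListLink {
        ListLink* prev = nullptr;
        ListLink* next = nullptr;
    };

    struct Request {
        ListLink link;       // intrusive hook: the pointers live inside the struct
        char payload[512];   // "large, specifically 100s of bytes"
    };

    struct IntrusiveList {
        ListLink head;       // sentinel node; an empty list points at itself
        IntrusiveList() { head.prev = head.next = &head; }

        void push_back(Request& r) {        // moving between collections is just
            ListLink* n = &r.link;          // pointer surgery: no allocation and
            n->prev = head.prev;            // no copying of the 512-byte payload
            n->next = &head;
            head.prev->next = n;
            head.prev = n;
        }

        void remove(Request& r) {           // O(1), no traversal or random access
            ListLink* n = &r.link;
            n->prev->next = n->next;
            n->next->prev = n->prev;
            n->prev = n->next = nullptr;
        }
    };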
Yes, hence why I said "application-level" development specifically. 99% of developers will never need to go down this path. Those that do are already working in unsafe land, usually in embedded systems and OS development, and they need to think carefully about what they're doing.
I'm personally not entirely sold on Rust's memory safety model, though I did learn to work with it when I was getting paid to work in it.
I don’t think it takes a PhD to write an unsafe linked list in Rust. And an unsafe linked list in Rust would still be safer than the equivalent linked list in C++
How would the equivalent linked list in c++ be unsafe?
The problem with c++ was never writing simple data structures, but was writing complete programs without a single mistake in the informal contracts between functions. Static analysis tools and other crutches help but having a stricter ruleset to enforce is what makes rust a winner.
The list itself would not be unsafe, but usages of it could be: for example, iterator invalidation, access from another thread, etc., which Rust would prevent you from doing even if you used unsafe to implement the linked list.
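A tiny sketch of the usage hazard I mean, using std::list for brevity instead of a hand-rolled list:

    #include <list>

    int main() {
        std::list<int> items = {1, 2, 3};
        auto it = items.begin();
        items.erase(it);    // 'it' is now invalidated
        return *it;         // compiles cleanly in C++, but this is undefined behaviour;
                            // safe Rust rejects the equivalent pattern at compile time
    }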
Rust still needs to prove itself on being a GCC/LLVM alternative, CUDA, SYCL, HPC and HFT compiler toolchains, MPI, AUTOSAR and MISRA standards, game console SDKs, mobile OS SDKs, ....
C++ still has lot of momentum for the upcoming decades, regardless how much we would like to RIIR everything.
I pretty strongly disagree with this. Rust doesn't replace all of C++; it only replaces some. Specifically, there's not a great reason to write a compiler in a language with manual memory management. HPC is also starting to move to higher level alternatives like Julia.
Also AUTOSAR/MISRA are funny to bring up because they are desperate and only semi-successful attempts to bring a semblance of safety to fundamentally unsafe languages. You get all of the safety of them (and more) in Rust with the 2 rules of 1. don't use unsafe, and 2. have tests.
Julia isn't primarily implemented in C++. Most of the implementation is in Julia (68%). About 25% is C/C++, but that could be changed if someone wrote a good enough compiler in a fast enough memory-safe language. The rest is in a custom Scheme dialect (femtolisp) and is in the process of being replaced by Julia (see JuliaSyntax.jl).
I'm hoping we get there. LLVM has really good codegen, but I think we can probably do a better job on the optimizer side than they do. Don't expect anything soon though. Replacing LLVM is (kinda obviously) a ridiculously huge project.
> If C++ doesn’t come up with a good story real soon to deal with memory safety at the language level and deal with the mess of language complexity, it will be relegated to a legacy language.
That's exactly the purpose of WG21 SG23, convened for the study of safety and security for C++26 and beyond. You can complain or you can join and influence the future.
I wish C++ would be as easy to build as Golang, and also as easy to include third-party libraries as Golang. Those are the two biggest pain points to using C++ for me.
I would like a standardized unit testing facility, a standardized code documentation facility to be added to that list. Also add a proper interpreter for dev-efficiency!
C++ the language is not at all a problem as much as C++ the tooling and developer ergonomics. For a language that has been in existence for 4 decades, you would think the ergonomics would be better but it seems all the graybeards want us all to suffer the way they did.
Modules look like an awesome, game-changing feature, and they were supposed to be a part of C++20. As far as I know I still can't reasonably use them with Clang and CMake. Anyone know what the problem is? Are enough people being paid to work on C++ implementations? Or is it so difficult that despite the resources, people can't implement it in 3 years?
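For anyone who hasn't tried them, a minimal sketch of what a named module looks like (file names/extensions are compiler-specific; MSVC conventionally uses .ixx):

    // math.ixx -- module interface unit
    export module math;
    export int add(int a, int b) { return a + b; }

    // main.cpp -- importer: no header, no include guards, no macro leakage
    import math;
    int main() { return add(2, 3); }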
The problem with clang is called Apple and Google: they decided to focus on their own languages (Swift, Objective-C, Carbon, C++17 being good enough), while the compiler vendors that profit from clang forks haven't cared that much about contributing to upstream.
Meanwhile VC++ and GCC are quite green on C++20 compliance, and GCC modules are almost there as well.
I wonder if I can use GCC on macOS to build apps for macOS/iOS that can be App Store deployed. Or if I'm stuck with Clang there.
These days I still use macOS and exist in the Apple ecosystem. I wouldn't mind moving to Linux, but unfortunately I need a MacBook to do work where I build iOS apps. I wonder if there is a way around that.
Apple is quite clear about what the official languages for iOS apps are and what role C++ plays there, like MSL and DriverKit. Note how many C++ talks happen at WWDC.
You're pretty much stuck with clang, unless you feel like hacking around the toolchain workflows.
> Also, at each meeting we regularly have new attendees who have never attended before, and this time there were 25 new first-time attendees in-person or on Zoom; to all of them, once again welcome!
So they were very inclusive and gave a warm welcome to their convicted rapist and child porn hoarder on the committee, over whom several other members resigned. They might have purged all women instead. I see one irl, and several others over Zoom.
Now I feel it's either make-C++-memory-safe-while-breaking-backwards-compatibility or die slowly. I was told by C++ experts there is no middle ground.
I actually started to shift focus toward Rust. Rust also provides more consistent build tooling, albeit build times are still very slow. Plus Rust does cross-platform decently. For C++, I need to create my CMake setup manually each time, and cross builds also involve manual setup each time.
As someone who (properly) learned modern C++ only some years ago and is just learning Rust, I agree - one needs a lot of learning in Rust before one can get kick-started. C++ is actually far easier to get off the ground with.
Modern C++ is actually quite nice to code in. It's just a real pain to get started - project setup, documentation, CI/CD setup, ease of dependency/pkg management, cross-compilation, unit tests, etc. Developer ergonomics are poor and not standardised. (There are also some critical features that are missing - like reflection - that make things a pain.)
Rust the language is far harder to learn (even if it is safer), but the tooling is mouth-wateringly superb.
In today's world, a language can live and die on developer ergonomics alone. Nobody really has the patience to do all the cruft-work that they could tolerate before - there is a lot of choice in the programming landscape.
conan or vcpkg might be able to help (I have used neither, though), or a CMake boilerplate might also help, but no such project has been widely adopted, so yeah, C++ tooling is still in the stone age, too bad.
edited: just checked out conan briefly; it's pretty close to cargo and the go tooling, going to use it more
I don't know that much about rust yet, I just assume that if I know c++ fairly well, it might just be 20% harder to learn rust and get good at it; time will tell.
This was not what I thought it was. I assumed it was the `for(auto f : <collection of class>)` issue that STL tried to fix and gave up on. Turns out it was initializers and locks: extending them to live as long as the loop, much like the `if` initializer syntax does for its block.
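For anyone unfamiliar, a range-for can take an init-statement whose scope covers the whole loop; a minimal sketch (the names here are made up):

    #include <mutex>
    #include <vector>

    std::mutex m;
    std::vector<int> shared_values = {1, 2, 3};

    int sum() {
        int total = 0;
        // The lock_guard in the init-statement stays alive for the entire loop,
        // much like the initializer in `if (auto x = ...; cond)` does for its block.
        for (std::lock_guard lock(m); int v : shared_values) {
            total += v;
        }
        return total;
    }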
I was going to say, he's 72 and living in NY - I guess trekking to the very far side of the continental US for the sake of a few pictures, in this day and age, wouldn't sound particularly appealing to him
just to be clear about the abi issues mentioned here, neither c++ nor c have a standardised abi. but there are widely agreed abis for different platforms. standardised language changes can thus not break standardised abis, because the standards don't specify abis.
WG21 can break, and has in the past[0] broken, platforms' ABIs. It tends to be painful enough that platform vendors (who are represented on the committees) aren't interested in doing it again.