- This is a good article but it only scratches the surface, as is always the case when it comes to C++.
When I made a meme about C++ [1] I was purposeful in choosing the iceberg format. To me it's not quite satisfying to say that C++ is merely complex or vast. A more fitting word would be "arcane", "monumental" or "titanic" (get it?). There's a specific feeling you get when you're trying to understand what the hell is an xvalue, why std::move doesn't move or why std::remove doesn't remove.
The Forrest Gump C++ is another meme that captures this feeling very well (not by me) [2].
What it comes down to is developer experience (DX), and C++ has a terrible one. From the syntax all the way up to package management, a C++ developer feels stuck in a time before they were born. At least we have a lot of time to think about all that while our code compiles. But that might just be the price for all the power it gives you.
- In Linuxland you at least have pkg-config to help with package management. It's not perfect but neither is any other package management solution.
If I'm writing a small utility or something the Makefile typically looks something like this:
    CC=clang
    PACKAGES=libcurl libturbojpeg
    CFLAGS=-Wall -pedantic --std=gnu17 -g $(shell pkg-config --cflags $(PACKAGES))
    LDLIBS=$(shell pkg-config --libs $(PACKAGES))

    ALL: imagerunner

    imagerunner: imagerunner.o image_decoder.o downloader.o
- Consider that to do this you must:
- Use a build system like make, you can't just `c++ build`
- Understand that C++ compilers by default have no idea where most things are, you have to tell them exactly where to search
- Use an external tool that's not your build system or compiler to actually inform the compiler what those search paths are
- Oh also understand the compiler doesn't actually output what you want, you also need a linker
- That linker also doesn't know where to find things, so you need the external tool to use it
- Oh and you still have to use a package manager to install those dependencies to work with pkg-config, and it will install them globally. If you want to use it in different projects you better hope you're ok with them all sharing the same version.
Now you can see why things like IDEs became default tools for teaching students how to write C and C++, because there's no "open a text editor and then `c++ build file.cpp` to get output" for anything except hello world examples.
- It's really not that big of a deal once you know how it works, and there are tools like CMake and IDEs that will take care of it.
On Windows and OSX it's even easier - if you're okay writing only for those platforms.
It's more difficult to learn, and it seems convoluted for people coming from Python and Javascript, but there are a lot of advantages to not having package management and build tooling tightly integrated with the language or compiler, too.
- I can use pkg-config just fine.
Not sure how relevant the objection is that "in order to use a tool, you need to learn how to use the tool".
Or from the other side: not sure what I should think about the quality of the work produced by people who don't want to learn relatively basic skills... it does not take two PhDs to understand how to use pkg-config.
- I'm just pointing out that one reason devex sucks in C++ is that you need a wide array of tools, which are non-portable and require learning and teaching magic incantations at the command line or in build scripts. That doesn't foster what one could call a "good" experience.
Frankly the idea that your compiler driver should not be a basic build system, package manager, and linker is an idea best left in the 80s where it belongs.
- I'm not going to defend the fact that the C++ devex sucks. There are really a lot of reasons for it, some of which can't sensibly be blamed on the language and some of which absolutely can be. (Most of it probably just comes down to the language and tooling being really old and not having changed in some specific fundamental ways.)
However, it's definitely wrong to say that the typical tools are "non-portable". The UNIX-style C++ toolchains work basically anywhere, including Windows, although I admit some of the tools require MSys/Cygwin. You can definitely use GNU Makefiles with pkg-config under MSys2 and have a fine experience. Needless to say, this also works on Linux, macOS, FreeBSD, Solaris, etc. More modern tooling like CMake and Ninja works perfectly fine on Windows, doesn't need any special environment like Cygwin or MSys, and can use your MSVC installation just fine.
I don't really think applying the mantra of Rust package management and build processes to C++ is a good idea. C++'s toolchain is amenable to many things that Rust and Cargo aren't. Instead, it'd be better to talk about why C++ sucks to use, and then try to figure out what steps could be taken to make it suck less. Like:
- Building C++ software is hard. There's no canonical build system, and many build systems are arcane.
This one really might be a tough nut to crack. The issue is that creating yet another system is bound to just cause xkcd 927. As it is, there are many popular ways to build, including GNU Make, GNU Autotools + Make, Meson, CMake, Visual Studio Solutions, etc.
CMake is the most obvious winner right now. It has achieved de facto standard status. It works on basically any operating system, and IDEs like CLion and Visual Studio 2022 have robust support for CMake projects.
Most importantly, building with CMake couldn't be much simpler. It looks like this:
    $ cmake -B .build -S .
    ...
    $ cmake --build .build
    ...

And you have a build in .build. I think this is acceptable. (A one-step build would be simpler, but this is definitely more flexible; I think it is very passable.)
This does require learning CMake, and CMake lists files are definitely a bit ugly and sometimes confusing. Still, they are pretty practical and rather easy to get started with, so I think it's a clear win. CMake is the de facto way to go here.
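To make that concrete, here is a minimal CMakeLists.txt sketch (the project name, source files, and the libcurl dependency are hypothetical; CURL::libcurl is the imported target CMake's bundled FindCURL module provides):

```cmake
cmake_minimum_required(VERSION 3.16)
project(imagerunner CXX)

add_executable(imagerunner main.cpp image_decoder.cpp downloader.cpp)

# External dependency resolved through CMake's find_package mechanism
find_package(CURL REQUIRED)
target_link_libraries(imagerunner PRIVATE CURL::libcurl)
```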
- Managing dependencies in C++ is hard. Sometimes you want external dependencies, sometimes you want vendored dependencies.
This problem's even worse. CMake helps a little here, because it has really robust mechanisms for finding external dependencies. However, while robust, the mechanism is definitely a bit arcane; it has two modes, the legacy Find scripts mode, and the newer Config mode, and some things like version constraints can have strange and surprising behavior (it differs on a lot of factors!)
But sometimes you don't want to use external dependencies, like on Windows, where it just doesn't make sense. What can you really do here?
I think the most obvious thing to do is use vcpkg. As the name implies, it's Microsoft's solution to source-level dependencies. Using vcpkg with Visual Studio and CMake is relatively easy, and it can be configured with a couple of JSON files (and there is a simple CLI that you can use to add/remove dependencies, etc.) When you configure your CMake build, your dependencies will be fetched and built appropriately for your targets, and then CMake's find package mechanism can be used just as it is used for external dependencies.
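As a sketch, a vcpkg manifest for a project depending on, say, curl and libjpeg-turbo (illustrative port names from the public vcpkg registry) is just a vcpkg.json next to your CMakeLists:

```json
{
  "name": "imagerunner",
  "version": "0.1.0",
  "dependencies": [ "curl", "libjpeg-turbo" ]
}
```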
CMake itself is also capable of vendoring projects within itself, and it's absolutely possible to support all three modalities of manual vendoring, vcpkg, and external dependencies. However, for obvious reasons this is generally not advisable. It's really complicated to write CMake scripts that actually work properly in every possible case, and many cases need to be prevented because they won't actually work.
All of that considered, I think the best existing solution here is CMake + vcpkg. When using external dependencies is desired, simply not using vcpkg is sufficient and the external dependencies will be picked up as long as they are installed. This gives an experience much closer to what you'd expect from a modern toolchain, but without limiting you from using external dependencies which is often unavoidable in C++ (especially on Linux.)
- Cross-compiling with C++ is hard.
In my opinion this is mostly not solved by the "de facto" toolchains. :)
It absolutely is possible to solve this. Clang is already better off than most of the other C++ toolchains in that it can handle cross-compiling with selecting cross-compile targets at runtime rather than build time. This avoids the issue in GCC where you need a toolchain built for each target triplet you wish to target, but you still run into the issue of needing libc/etc. for each target.
Both CMake and vcpkg technically do support cross-compilation to some extent, but I think it rarely works without some hacking around in practice, in contrast to something like Go.
If cross-compiling is a priority, the Zig toolchain offers a solution for C/C++ projects that includes both effortless cross-compiling as well as an easy to use build command. It is probably the closest to solving every (toolchain) problem C++ has, at least in theory. However, I think it doesn't really offer much for C/C++ dependencies yet. There were plans to integrate vcpkg for this I think, but I don't know where they went.
If Zig integrates vcpkg deeply, I think it would become the obvious choice for modern C++ projects.
I get that by not having a "standard" solution, C++ remains somewhat of a nightmare for people to get started in, and I've generally been doing very little C++ lately because of this. However I've found that there is actually a reasonable happy path in modern C++ development, and I'd definitely recommend beginners to go down that path if they want to use C++.
- That idea that packages and builds are a simple problem only holds for small projects; large projects need things like more than one language and so end up fighting the language.
- Every modern language seems to have an answer to this problem that C and C++ refuse to touch because it's out of scope for their respective committees and standards orgs
- On the front page right now:
Shai-Hulud malware attack: Tinycolor and over 40 NPM packages compromised (stepsecurity.io)
935 points by jamesberthoty 16 hours ago | 730 comments
Maybe obstreperous dependency management ends up being the winning play in 2025 :)
- Just think of how many _more_ vulns C and C++ could be responsible for if they had modern package managers! :)
- C++ has a plethora of available build and package management systems. They just aren't bundled with the compiler. IMO that is a good thing, because it keeps the compiler writers honest.
- You say that as if Cargo, MSBuild, and pip aren’t massively loved by their communities.
- "Massively loved" and "good decision" are orthogonal axes. See the current npm drama. People love wantonly importing dependencies the way they love drinking. Both feel great but neither is good for you.
- Not that npm-style package management is the best we can do or anything, but I would be more sympathetic to this argument if C or C++ had a clearly better security story than JS, Python, etc. (pick your poison), but they're also disasters in this area.
What happens in practice is people end up writing their own insecure code instead of using someone else's insecure code. Of course, we can debate the tradeoffs of one or the other!
- None of that is a problem
There are a lot of problems, but having to carefully construct the build environment is a minor one time hassle.
Then repeated footguns going off, no toes left, company bankrupt, and the banking system crashed, again.
- > There are a lot of problems, but having to carefully construct the build environment is a minor one time hassle.
I've observed the existence in larger projects of "build engineers" whose sole job is to keep the project building on a regular cadence. These jobs predominantly seem to exist in C++ land.
- Most of what I have seen came from technical debt acquired over decades. With some of the build engineers hired to "manage" that themselves not being treated as programmers and just adding on top of the mess with "fixes" that are never reviewed or even checked in. Had a fun time once after we reinstalled the build server and found out that the last build engineer had created a local folder to store various dependencies instead of using vcpkg to fetch everything, as we had mandated for several years by then.
- I’ve only ever seen this on extraordinarily complex codebases that mixed several languages. Pure C++ scales really well these days.
- How is that even possible?
Wasn't CI invented to solve just this problem?
- You have a team of 20 engineers on a project you want to maintain velocity on. With that many cooks, you have patches on top of patches of your build system where everyone does the bare minimum to meet the near-term task only, and it devolves into a mess no one wants to touch over enough time.
Your choice: do you have the most senior engineers spend time sporadically maintaining the build system, perhaps declaring fires to try to pay off tech debt, or hire someone full time, perhaps cheaper and with better expertise, dedicated to the task instead?
CI is an orthogonal problem but that too requires maintenance - do you maintain it ad-hoc or make it the official responsibility for someone to keep maintained and flexible for the team’s needs?
I think you think I’m saying the task is keeping the build green whereas I’m saying someone has to keep the system that’s keeping the build green going and functional.
- > You have a team of 20 engineers on a project you want to maintain velocity on. With that many coooks, you have patches on top of patches of your build system ...
The scenario you are describing does not make sense for the commonly accepted industry definition of "build system." It would make sense if, instead, the description was "application", "product", or "system."
Many software engineers use and interpret the phrase "build system" to be something akin to make[0] or similar solution used to produce executable artifacts from source code assets.
0 - https://man.freebsd.org/cgi/man.cgi?query=make&apropos=0&sek...
- I can only relate to you what I’ve observed. Engineers were hired to rewrite the Make-based system into Bazel and maintain it for single executable distributed to the edge. I’ve also observed this for embedded applications and other stuff.
I’m not sure why you’re dismissing it as something else without knowing any of the details or presuming I don’t know what I’m talking about.
- Wow, I don't understand what anything means in those memes. And I'm so glad I don't!
It seems to me that the people/committees who built C++ just spent decades inventing new and creative ways for developers to shoot themselves in the foot. Like, why does the language need to offer a hundred different ways to accomplish each trivial task (and 98 of them are bad)?
- The road to hell is paved with 40 years of backwards compatibility requirements.
- > in C++, you can write perfectly fine code without ever needing to worry about the more complex features of the language. You can write simple, readable, and maintainable code in C++ without ever needing to use templates, operator overloading, or any of the other more advanced features of the language.
Only if you have full control over what others are writing. In reality, you're going to read lots and lots of "clever" code. And I'm saying this as a person who has written a good amount of template metaprogramming code. Even for me, some code takes hours to understand, and I was usually able to cut 90% of it afterwards.
- I’m probably guilty of gratuitous template stuff, because it adds fun to the otherwise boring code I spend a lot of time on. But I feel like the 90% cutdowns are when someone used copy-paste instead of templates, overloads, and inheritance. I don’t think both problems happen at the same time, though, or maybe I misunderstood.
- When people are obsessed with over-abstraction and over-generalization, you can often see FizzBuzz Enterprise in action where a single switch statement is more than enough.
- Being able to cut 90% of code sounds like someone was getting paid by LoC (which is also a practice from a time when C++ was considered a "modern" language).
- > make it look as much like C as you possibly can, and avoid using too many advanced features of the language unless you really need to.
Oh boy!
This person needs control.
That is where I left C++, a better C
Faint praise
- > in C++, you can write perfectly fine code without ever needing to worry about the more complex features of the language. You can write simple, readable, and maintainable code in C++ without ever needing to use templates, operator overloading, or any of the other more advanced features of the language.
This... doesn't really hold water. You have to learn about what the insane move semantics are (and the syntax for move ctors/operators) to do fairly basic things with the language. Overloaded operators like operator*() and operator<<() are widely used in the standard library so you're forced to understand what craziness they're doing under the hood. Basic standard library datatypes like std::vector use templates, so you're debugging template instantiation issues whether you write your own templated code or not.
- Reminds me of the old quote
> everyone only uses 20% of C++, the problem is that everyone uses a different 20%
- I’ve been programming C++ on a daily basis for more than 20 years and literally never use the >> operator. Never. Not rarely, never.
- I didn't mention >>.
- How do you shift bits to the right?
- right_shifted = value / (1 << bits)
- Division is slow, though. You should use something like:
right_shifted = (int)(value * pow(2, -bits) - 0.5)
- Or just rely on the compiler to automatically do trivial conversions. Which are pretty reliable these days.
- Manipulating bit patterns isn't really that common in most workloads.
- Overloaded operators were a terrible mistake in every programming language I've encountered them in. (Yes, sorry Haskell, you too!)
I don't think move semantics are really that bad personally, and some languages move by default (isn't that Rust's whole thing?).
What I don't like is the implicit ambiguous nature of "What does this line of code mean out of context" in C++. Good luck!
I have hope for cppfront/Cpp2. https://github.com/hsutter/cppfront
(oh and I think you can write a whole book on the different ways to initialize variables in C++).
The result is you might be able to use C++ to write something new, and stick to a style that's readable... to you! But it might not make everyone else who "knows C++" instantly able to work on your code.
- Overloaded operators are great. But overloaded operators that do something entirely different than their intended purpose is bad. So a + operator that does an add in your custom numeric data type is good. But using << for output is bad.
- The first programming language with overloaded operators that I really got into was Scala, and I still love it. I love that instead of Java's x.add(y); I can overload + so that it calls .add between two objects of a given type. It of course has to be used responsibly, but it makes a lot of code really more readable.
- I will die on the hill that string concatenation should have its own operator, and overloading + for the operation is a mistake.
Languages that get it right: SQL, Lua, ML, Perl, PHP, Visual Basic.
- I think it's fine when the language has sufficiently strict types for string concatenation.
Unfortunately, many languages allow `string + int`, which is quite problematic. Java is to blame for some of this.
And C++ is even worse since literals are `const char[]` which decays to pointer.
Languages okay by my standard but not yours include: Python, Ruby.
- Alternatively, any implementation of operator+ should have a notional identity element, an inverse element and be commutative.
- > I don't think move semantics are really that bad personally, and some languages move by default (isn't that Rust's whole thing?).
Rust's move semantics are good! C++'s have a lot of non-obvious footguns.
> (oh and I think you can write a whole book on the different ways to initialize variables in C++).
Yeah. Default init vs value init, etc. Lots of footguns.
- Operator overloading is essential for computer graphics libraries, for vector and matrix multiplication, which become an illegible mess without it.
- I personally think that operator overloading itself is justified, but the pervasive scope of operator overloading is bad. To me the best solution is from OCaml: all operators are regular functions (`a + b` is `(+) a b`) and default bindings can't be changed but you can import them locally, like `let (+) = my_add in ...`. OCaml also comes with a great convenience syntax where `MyOps.(a + b * c)` is `MyOps.(+) a (MyOps.(*) b c)` (assuming that MyOps defines both `(+)` and `(*)`), which scopes operator overloading in a clear and still convenient way.
- A benefit of operator overloads is that you can design drop-in replacements for primitive types to which those operators apply but with stronger safety guarantees e.g. fully defining their behavior instead of leaving it up to the compiler.
This wasn't possible when they were added to the language and wasn't really transparent until C++17 or so but it has grown to be a useful safety feature.
- "you can write perfectly fine code without ever needing to worry about the more complex features of the language. You can write simple, readable, and maintainable code in C++ without ever needing to use templates, operator overloading, or any of the other more advanced features of the language."
You could also inherit a massive codebase old enough to need a prostate exam that was written by many people who wanted to prove just how much of the language spec they could use.
If selecting a job mostly under the Veil of Ignorance, I'll take a large legacy C project over C++ any day.
- It's a fine post except for this:
> Countless companies have cited how they improved their security or the amount of reported bugs or memory leaks by simply rewriting their C++ codebases in Rust. Now is that because of Rust? I’d argue in some small part, yes.
Just delete this. Even an hour's familiarity with Rust will give you a visceral understanding that "Rewrites of C++ codebases to Rust always yield more memory-safe results than before" is absolutely not because "any rewrite of an existing codebase is going to yield better results". If you don't have that, skip it, because it weakens the whole piece.
- C++ will always stay relevant. Software has eaten the world. That transition is almost complete now. The languages that were around when it happened will stay deeply embedded in our fundamental tech stacks for another couple decades at least, if not centuries. And C and C++ are the lion's share of that.
COBOL sticks around 66 years after its first release. Fortran is 68 years old and is still enormously relevant. Much, much more software was written in newer languages and has become so complex that replacements have become practically impossible (Fuchsia hasn't replaced Linux in Google products, Wayland isn't ready to replace X11, etc.)
- It seems likely that C++ will end up in a similar place as COBOL or Fortran, but I don't see that as a good future for a language.
These languages are not among the top contenders for new projects. They're a legacy problem, and are kept alive only by a slowly shrinking number of projects. It may take a while to literally drop to zero, but it's a path of exponential decay towards extinction.
C++ has strong arguments for sticking around as a legacy language for several too-big-to-rewrite C++ projects, but it's becoming less and less attractive for starting new projects.
C++ needs a better selling point than being a language that some old projects are stuck with. Without growth from new projects, it's only a matter of time until it's going to be eclipsed by other languages and relegated to shrinking niches.
- As long as people write software (no pun intended), software will follow trends. For instance, in many scientific ecosystems, Matlab was successfully replaced by Scipy, which in turn is getting replaced by Julia. Things don't necessarily have to stay the same. Interestingly, such a generational trend is currently happening with Rust, even though there have been numerous other popular languages, such as D or Zig, that didn't get the same traction.
Sure, there are still Fortran codes. But I can hardly imagine that Fortran still plays a big role in another 68 years from now on.
- Matlab/Scipy/Julia are totally different since those function more like user interfaces, they are directly user facing. You're not building an app with matlab (though you might be with scipy and julia, it's not the primary use case), you're working with data. C++ on the other hand underpins a lot of key infrastructure.
- Scipy is a wrapper of Numpy, which is a wrapper of C and Fortran.
- I am not saying that these languages will stay around forever, mind you. But we have solidified the tech stacks involving these languages by making them ridiculously complex. Replacement of a programming language in one of the core components can only come through gradual and glacially slow evolution at this point. "Rewrite it in XYZ" as a clean slate approach on a big scale is simply a pipe dream.
Re Matlab: I still see it thriving in the industry, for better or worse. Many engineers just seem to love it. I haven't seen many users of Julia yet. Where do you see those? I think that Julia deserves a fair chance, but it just doesn't have a presence in the fields I work in.
- I've heard via former employees that Mathworks has conceded that Python ate Matlab's niche and that they're focusing on Simulink
- You’re thinking of software that is being written today. GP is talking about software we use every day in every device on the planet that hasn’t changed since it was written 30+ years ago.
- What is this software? E.g. Linux is 33 years old; barely a few percent of Linux 1.0 remains in a modern kernel, if we count lines of code.
Maybe GNU Emacs has a larger percentage remaining intact; at least it retains some architectural idiosyncrasies from 1980s.
As of Fortran, modern Fortran is a pretty nice and rich language, very unlike the Fortran-77 I wrote at high school.
- Especially the 'backend' languages that do all the heavy lifting for domain-specific software. Just in my vertical of choice, financial software, there are literally billions of lines of Java and .NET code powering critical systems. The code is the documentation, and there's little appetite to rewrite all that at enormous cost and risk.
Perhaps AI will get reliable enough to pore over these double-digit-million-LOC codebases and convert them flawlessly, but that looks like it's decades off at this point.
- I'm not so sure. The user experience has really crystallized over the years. It's not hard to imagine a smart tv or something like it just reimplementing that experience in hardware in the not too distant future (say 2055 if transistor and memory scaling stall in 2035).
We live in a special time when general processing efficiency has always been increasing. The future is full of domain specific hardware (enabling the continued use of COBOL code written for slower mainframes). Maybe this will be a half measure like cuda or your c++ will just be a thin wrapper around a makeYoutube() ASIC
Of course if there is a breakthrough in general purpose computing or a new killer app it will wipe out all those products which is why they don't just do it now
- > Software has eaten the world.
Bit off more than it could chew; now we all have indigestion.
- A pet peeve of mine is when people claim C++ is a superset of C. It really isn't. There's a lot of little nuanced differences that can bite you.
Ignore the fact that having more keywords in C++ precludes the legality of some C code being C++. (`int class;`)
void * implicit casting in C just works, but in C++ it must be an explicit cast (which is kind of funny considering all the confusing implicit behavior in C++).
C++20 does have C11's designated initialization now, which helps in some cases, but that was a pain for a long time.
enums and conversion between integers is very strict in C++.
`char * message = "Hello"` is valid C but not C++ (since you cannot mutate the pointed to string, it must be `const` in C++)
C99 introduced variadic macros that didn't become standard C++ until 2011.
C doesn't allow for empty structs. You can do it in C++, but sizeof(EmptyStruct) is 1. And if C lets you get away with it in some compilers, I'll bet it's 0.
Anyway, all of these things and likely more can ruin your party if you think you're going to compile C code with a C++ compiler.
Also don't forget if you want code to be C callable in C++ you have to use `extern "C"` wrappers.
- > C++20 does have C11's designated initialization now, which helps in some cases, but that was a pain for a long time.
C++ designated initializers are slightly different in that the initialization order must match the declared member order. That is not required in C.
- It also completely negates their utility, even though the exact "problem" they always bring up is already solved when you use normal constructors.
- > You can do it in C++, but sizeof(EmptyStruct) is 1.
Unless you use the C++20 [[no_unique_address]] attribute, in which case it is 0 (if used correctly).
- The complexity argument is just not true. You do have to know this stuff in C++; you run into it all the time.
I wish I didn’t have to know about std::launder but I do
- I feel like C++ is a bunch of long chains of solutions creating problems that require new solutions, that start from claiming that it can do things better than C.
Problem 1: You might fail to initialize an object in memory correctly.
Solution 1: Constructors.
Problem 2: Now you cannot preallocate memory as in SLAB allocation since the constructor does an allocator call.
Solution 2: Placement new
Problem 3: Now the type system has led the compiler to assume your preallocated memory cannot change since you declared it const.
Solution 3: std::launder()
If it is not clear what I mean about placement new and const needing std::lauder(), see this:
https://miyuki.github.io/2016/10/21/std-launder.html
C has a very simple solution that avoids this chain. Use structured programming to initialize your objects correctly. You are not going to escape the need to do this with C++, but you are guaranteed to have to consider a great many things in C++ that would not have needed consideration in C since C avoided the slippery slope of syntactic sugar that C++ took.
- I absolutely agree - your chain of reasoning follows as well. It doesn't seem like it at first, but the often praised constructor/destructor is actually a source of incredible complexity, probably more than virtual.
- You need something like std::launder in any systems language for certain situations, it isn’t a C++ artifact.
Before C++ added it we relied on undefined behavior that the compilers agreed to interpret in the necessary way if and only if you made the right incantations. I’ve seen bugs in the wild because developers got the incantations wrong. std::launder makes it explicit.
For the broader audience because I see a lot of code that gets this wrong, std::launder does not generate code. It is a compiler barrier that blocks constant folding optimizations of specific in-memory constants at the point of invocation. It tells the compiler that the constant it believes lives at a memory address has been modified by an external process. In a C++ context, these are typically restricted to variables labeled ‘const’.
This mostly only occurs in a way that confuses the compiler if you are doing direct I/O into the process address space. Unless you are a low-level systems developer it is unlikely to affect you.
- Do you see all the concepts you had to describe here?
> Unless you are a low-level systems developer it is unlikely to affect you.
Making new data structures is common. Serializing classes into buffers is common.
- When it comes to programming, I generally decide my thoughts based on pain-in-my-ass levels. If I constantly have to fiddle with something to get it working, if it's fragile, if it frequently becomes a pain point - then it's not great.
And out of all the tools and architecture I work with, C++ has been some of the least problematic. The STL is well-formed and easy to work with, creating user-defined types is easy, it's fast, and generally it has few issues when deploying. If there's something I need, there's a very high chance a C or C++ library exists to do what I need. Even crossing multiple major compiler versions doesn't seem to break anything, with rare exceptions.
The biggest problem I have with C++ is how easy it is to get very long compile times, and how hard it feels to analyze and fix that on a 'macro' (whole-project) level. I waste ungodly amounts of time compiling. I swear I'm going to be on death's door and see GCC running as my life flashes by.
Some others that have been not-so-nice:
* Python - Slow enough to be a bottleneck semi-frequently, hard to debug especially in a cross-language environment, frequently has library/deployment/initialization problems, and I find it generally hard to read because of the lack of types, significant whitespace, and that I can't easily jump with an IDE to see who owns what data. Also pip is demon spawn. I never want to see another Wheel error until the day I die.
* VSC's IntelliSense - My god IntelliSense is picky. Having to manually specify every goddamn macro, one at a time in two different locations just to get it to stop breaking down is a nightmare. I wish it were more tolerant of having incomplete information, instead of just shutting down completely.
* Fortran - It could just be me, but IDEs struggle with it. If you have any global data it may as well not exist as far as the IDE is concerned, which makes dealing with such projects very hard.
* CMake - I'm amazed it works at all. It looks great for simple toy projects and has the power to handle larger projects, but it seems to quickly become an ungodly mess of strange comments and rules that aren't spelled out - and you have no way of stepping into it and seeing what it's doing. I try to touch it as infrequently as possible. It feels like C macros, in a bad way.
- CMake is not a great language, but great effort has been put into cleaning up how things should be done. However, you can't just upgrade; someone needs to go through the effort of actually using all that new stuff. In almost all projects the build system is an afterthought that developers touch as little as possible to make things work, and so it accumulates cruft constantly.
You can do much better in CMake if you put some effort into cleaning it up - I have little hope anyone will do this though. We have a hard time getting developers to clean up messes in production code and that gets a lot more care and love.
- I agree. Unless the project is huge, it's totally possible to use CMake in a maintainable way. It just requires some effort (not so much, but not nothing).
- If you are willing to give up incremental compilation, concatenating all C++ files into a single file and compiling that on a single core will often outperform a multi-core compilation. The reason is that the compiler spends most of its time parsing headers, and when you concatenate everything into a single file (use the C preprocessor for this), it only needs to parse the headers once.
Merely parsing C++ code requires a higher time complexity than parsing C code (linear time parsers cannot be used for C++), which is likely where part of the long compile times originate. I believe the parsing complexity is related to templates (and the headers are full of them), but there might be other parts that also contribute to it. Having to deal with far more abstractions is likely another part.
That said, I have been incrementally rewriting a C++ code base at a health care startup into a subset of C with the goal of replacing the C++ compiler with a C compiler. The closer the codebase comes to being C, the faster it builds.
- > I never want to see another Wheel error until the day I die.
What exactly do you mean by a "Wheel error"? Show me a reproducer and a proper error message and I'll be happy to help to the best of my ability.
By and large, the reason pip fails to install a package is because doing so requires building non-Python code locally, following instructions included in the package. Only in rare cases are there problems due to dependency conflicts, and these are usually resolved by creating a separate environment for the thing you're trying to install — which you should generally be doing anyway. In the remaining cases where two packages simply can't co-exist, this is fundamentally Python's fault, not the installer's: module imports are cached, and quite a lot of code depends on the singleton nature of modules for correctness, so you really can't safely load up two versions of a dependency in the same process, even if you hacked around the import system (which is absolutely doable!) to enable it.
As for finding significant whitespace (meaning indentation used to indicate code structure; it's not significant in other places) hard to read, I'm genuinely at a loss to understand how. Python has types; what it lacks is manifest typing, and there are many languages like this (including Haskell, whose advocates are famous for explaining how much more "typed" their language is than everyone else's). And Python has a REPL, the -i switch, and a built-in debugger in the standard library, on top of not requiring the user to do the kinds of things that most often need debugging (i.e. memory management). How can it be called hard to debug?
- Unfortunately that Wheel situation was far enough back now that I don't have details on hand. I just know it was awful at the time.
As for significant whitespace, the problem is that I'm often dealing with files with several thousand lines of code and heavily nested functions. It's very easy to lose track of scope in that situation. Am I in the inner loop, or this outer loop? Scrolling up and down, up and down to figure out where I am. Feels easier to make mistakes as well.
It works well if everything fits on one screen, it gets harder otherwise, at least for me.
As for types, I'm not claiming it's unique to Python. Just that it makes working with Python harder for me. Being able to see the type of data at a glance tells me a LOT about what the code is doing and how it's doing it - and Python doesn't let me see this information.
As for debugging, it's great if you have pure Python. Mix other languages in and suddenly it becomes a pain. There's no way to step from another language into Python (or vice-versa), at least not cleanly and consistently. This isn't always true for compiled->compiled. I can step from C++ into Fortran just fine.
- > Am I in the inner loop, or this outer loop? Scrolling up and down, up and down to figure out where I am.
Find an IDE or extension which provides the nesting context on top of the editor. I think vs code has it built in these days.
- Pip has changed a lot in the last few years, and there are many new ecosystem standards, along with greater adoption of existing ones.
> I'm often dealing with files with several thousand lines of code and heavily nested functions.
This is the problem. Also, a proper editor can "fold" blocks for you.
> Being able to see the type of data at a glance tells me a LOT about what the code is doing and how it's doing it - and Python doesn't let me see this information.
If you want to use annotations, you can, and have been able to since 3.0. Since 3.5 (see https://peps.python.org/pep-0484/; it's been over a decade now), there's been a standard for understanding annotations as type information, which is recognized by multiple different third-party tools and has been iteratively refined ever since. It just isn't enforced by the language itself.
> Mix other languages in and suddenly it becomes pain.... This isn't always true for compiled->compiled.
Sure, but then you have to understand the assembly that you've stepped into.
- >This is the problem. Also, a proper editor can "fold" blocks for you.
I can't fix that. I just work here. I've got to deal with the code I've got to deal with. And for old legacy code that's sprawling, I find braces help a LOT with keeping track of scope.
>Sure, but then you have to understand the assembly that you've stepped into.
Assembly? I haven't touched raw assembly since college.
- > And for old legacy code that's sprawling, I find braces help a LOT with keeping track of scope.
How exactly are they more helpful than following the line of the indentation that you're supposed to have as a matter of good style anyway? Do you not have formatting tools? How do you not have a tool that can find the top of a level of indentation, but do have one that can find a paired brace?
>Assembly? I haven't touched raw assembly since college.
How exactly does your debugger know whether the compiled code it stepped into came from C++ or Fortran source?
- >How exactly does your debugger know whether the compiled code it stepped into came from C++ or Fortran source?
I don't know what IDE GP might be using, but mixed-language debuggers for native code are pretty simple as long as you just want to step over. Adding support for Fortran to, say, Visual Studio wouldn't be a huge undertaking. The mechanism to detect where to put the cursor when you step into a function is essentially the same as for C and C++. Look at the instruction pointer, search the known functions for an address that matches, and jump to the file and line.
- Great article. Modern C++ has come a really long way. I think lots of people have no idea about the newer features of the standard library and how much they minimize footguns.
- Lambdas, a modern C++ feature, can borrow from the stack and escape the stack. (This led to one of the more memorable bugs I've been part of debugging.) It's hard to take any claims about modern C++ seriously when the WG thought this was an acceptable feature to ship.
Of course, the article doesn't mention lambdas.
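To make the footgun concrete, a minimal sketch (function names are mine; std::function is just one way to let the closure escape):

```cpp
#include <functional>

// DANGEROUS: the lambda captures `local` by reference. Once this function
// returns, the reference dangles, and *calling* the returned closure is
// undefined behavior. This compiles without a warning by default.
inline std::function<int()> escapes_by_reference() {
    int local = 42;
    return [&local] { return local; };
}

// SAFE: capturing by value copies `local` into the closure object,
// so the closure can outlive the stack frame it was created in.
inline std::function<int()> escapes_by_value() {
    int local = 42;
    return [local] { return local; };
}
```

The two versions differ by a single `&` in the capture list, which is exactly why this class of bug is so easy to write and so memorable to debug.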
- Capturing lambdas are no different from handwritten structures with operator() ("functors"), so it makes no sense to castrate them.
Borrowing from the stack is super useful when your lambda also lives on the stack; stack escaping is a problem, but it can be made harder by having templates take Fn& instead of const Fn& or Fn&&; that, or just a plain function pointer.
- Convenience is a difference in kind.
Like, I'm not god's gift to programming or anything, but I'm decently good at it, and I wrote a use-after-return bug due to a lambda reference last week.
- Borrowing from the stack is definitely useful. I do it all the time, safely (truly safely), in Rust.
- They can, but I find in practice that I never do this so it doesn't matter.
- I'm glad, but my problem is with the claim that modern C++ is safer. They added new features that are very easy to misuse.
Meanwhile in Rust you can freely borrow from the stack in closures, and the borrow checker ensures that you'll not screw up. That's what (psychological) safety feels like.
- Lambdas are syntactic sugar over functors, and it was possible all along to define a functor that stores a local address and then return it from the scope, thus leaving a dangling pointer. They don't introduce any new places for bugs to creep in, other than confusing programmers who are used to garbage-collected languages. That C++11 is safer than C++98 is still true, as this and other convenience features make it harder to introduce bugs from boilerplate code.
- Why wouldn't it be acceptable to ship? This is how everything works in C++. You always have to mind your references.
- This is like writing an article entitled "In Defense of Guns", and then belittling the fact it can kill by saying "You always have to track your bullets".[1]
[1] Not me making this up - I started getting into guns and this is what people say.
- Exactly! This is my problem with the C++ community's culture. At no point is safety put first.
- It's worse. The day I discovered that std::array is explicitly not range/bounds checked by default, I really wanted to write some angry letters to the committee members.
Why go through all the trouble to make a better array, then require the user to call a special .at() function to get range checking rather than the other way around? I promptly went into my standard library and reversed that decision, because if I'm going to the trouble of using a C++ array class, it had better damn well give me a tiny bit of additional protection. The .at() call should have been the version that reverted to C array behavior without the bounds checking.
And it's these kinds of decisions repeated over and over. I get that it's a committee, and some of the decisions won't be the best, but by 2011 everyone had already been complaining about memory safety issues for 15+ years. Was there not enough will on the committee to recognize that a big reason for using C++ over C was the ability of the language to protect against some of the sharper edges of C?
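For anyone unfamiliar with the asymmetry being complained about, a minimal sketch (the function name is mine, assuming C++11 or later):

```cpp
#include <array>
#include <stdexcept>

// Returns true iff an out-of-bounds index through .at() threw.
// By contrast, std::array::operator[] performs no check at all:
// `arr[10]` below would be silent undefined behavior, not an exception.
inline bool at_checks_bounds() {
    std::array<int, 3> arr{1, 2, 3};
    try {
        (void)arr.at(10);  // checked access: throws std::out_of_range
    } catch (const std::out_of_range&) {
        return true;
    }
    return false;
}
```

So the terse, default-looking spelling is the unchecked one, and the checked access is the one you have to opt into, which is the inversion being criticized.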
- >Why go through all the trouble to make a better array, and require the user to call a special .at() function to get range checking rather than the other way around?
Because the point was not to make an array type that's safe by default, but rather to make an array type that behaves like an object, and can be returned, copied, etc. I mean, I agree with you, I think operator[]() should range-check by default, but you're simply misunderstanding the rationale for the class.
- std::array [] is checked if you have the appropriate build settings toggled, which of course you should during development.
The same applies to many of the other baseless complaints I'm seeing here. Learn to use your tools, fools.
- Good news! Contracts were approved for c++26 so they should be in compilers by like 2031 and then you can configure arrays and vectors to abort on out-of-bounds errors instead of corrupting your program.
Let no one accuse the committee of being unresponsive.
- Yeah, it's great that the C++ community starts to take safety in consideration, but one has to admit that safety always comes as the last priority, behind compatibility, convenience, performance and expressiveness.
- I eagerly await the day when they do away with the distinction between ".cpp" and ".hpp" files and the textual substitution nature of "#include" and replace them all with a proper module system.
- Good article overall. There's one part I don't really agree with:
> Here’s a rule of thumb I like to follow for C++: make it look as much like C as you possibly can, and avoid using too many advanced features of the language unless you really need to.
This has me scratching my head a bit. In spite of C++ being nearly a superset of C, they are very different languages, and idiomatic C++ doesn't look very much like C. In fact, I'd argue that most of the stuff C++ adds to C allows you to write code that's much cleaner than the equivalent C code, if you use it the intended way. The one big exception I can think of is template metaprogramming, since the template code can be confusing, but if done well, the downstream code can be incredibly clean.
There's an even bigger problem with this recommendation, which is how it relates to something else talked about in the article, namely "safety." I agree with the author that modern C++ can be a safe language, with programmer discipline. C++ offers a very good discipline to avoid resource leaks of all kinds (not just memory leaks), called RAII [1]. The problem here is that C++ code that leverages RAII looks nothing like C.
Stepping back a bit, I feel there may be a more fundamental fallacy in this "C++ is Hard to Read" section in that the author seems to be saying that C++ can be hard to read for people who don't know the language well, and that this is a problem that should be addressed. This could be a little controversial, but in my opinion you shouldn't target your code to the level of programmers who don't know the language well. I think that's ultimately neither good for the code nor good for other programmers. I'm definitely not an expert on all the corners of C++, but I wouldn't avoid features I am familiar with just because other programmers might not be.
- > C++ is very old, in fact, it came out in 1985, to put it into perspective, that’s 4 years before the first version of Windows was released
Nitpick, I guess, but Windows 1.0 was released in November 1985:
- Funny how silly Windows 1 looks compared to Mac OS 1. I wonder if it was the color support taking resources.
- > You can write simple and readable code in C++ if you want to. You can also write complex and unreadable code in C++ if you want to. It’s all about personal or team preference.
Problem is, if you’re using C++ for anything serious, like the aforementioned game development, you will almost certainly have to use the existing libraries; so you’re forced to match whatever coding style they chose to use for their codebase. And in the case of Unreal, the advice “stick to the STL” also has to be thrown out since Unreal doesn’t use the STL at all. If you could use vanilla, by-the-books C++ all the time, it’d be fine, but I feel like that’s quite rare in practice.
- When NIST released its summary judgement against C++ and other languages it deemed memory unsafe, the problem became less technical and more about politics and perception. If you're looking to work within two arms' length of the US Government, you have to consider the "written in C++" label seriously, regardless of how correct the code may be.
- Nothing is going to happen for the foreseeable future, at least in the parts of government I tend to work with. It doesn't even come up in discussions of critical high-reliability system. They are still quite happy to buy and use C++, so I expect that is what they will be getting.
- The government is still happily commissioning new software projects that use C++. That may change in a few years, and some organizations may already be treating C++ more critically, but so far it's been unimpactful.
- > Yes, C++ can be unsafe if you don’t know what you’re doing
I feel like I always hear this argument for continuing to use C++.
I, on the other hand, want a language that doesn't make me feel like I'm walking a tightrope with every line of code I write. Not sure why people can't just admit the humans are not robots and will write incorrect code.
- > Just using Rust will not magically make your application safe; it will just make it a lot harder to have memory leaks or safety issues.
You know, not sure I even agree with the memory leaks part. If you define a memory leak very narrowly as forgetting to free a pointer, this is correct. But in my experience working with many languages including C/C++, forgotten pointers are almost never the problem. You're gonna be dealing with issues involving "peaky" memory usage e.g. erroneously persistent references to objects or bursty memory allocation patterns. And these occur in all languages.
- Nice thing about Rust is not that you cannot write such code; it's that you know exactly where you used peaky memory, reinterpreted something as an unsigned integer, or replaced your program stack with something else. All such cases require unsafe blocks in Rust. It's a screaming indicator of "here be dragons", the do-not-press-this-red-button-unless-you-intend-to sign.
In C and C++ no such thing exists. It is walking in a minefield. It's worse with C++, because they piled on so much stuff that nobody knows off the top of their head how a variable gets initialized. The initialization rules are insane: https://accu.org/journals/overload/25/139/brand_2379/
So if you are doing peaky memory stuff with complex partially self-initializing code in C++, there are so many ways of blowing yourself and your entire team up without knowing which bit of code you committed years ago caused it.
- > All of such cases require unsafe blocks in Rust.
It's true that Rust makes it much harder to leak memory compared to C and even C++, especially when writing idiomatic Rust -- if nothing else, simply because Rust forces the programmer to think more deeply about memory ownership.
But it's simply not the case that leaking memory in Rust requires unsafe blocks. There's a section in the Rust book explaining this in detail[1] ("memory leaks are memory safe in Rust").
[1] https://doc.rust-lang.org/book/ch15-06-reference-cycles.html
- C++'s design encourages that kind of allocation "leak" though. The article suggests using smart pointers, so let's take an example from there and mix make_shared with weak_ptr. Congrats, you've now extended the lifetime of the allocation to whatever the lifetime of your weak pointer is.
Rc::Weak does the same thing in Rust, but I rarely see anyone use it.
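A sketch of what I mean (names mine; the retained allocation itself isn't observable from portable code, but the object's destruction is):

```cpp
#include <memory>

struct Big { char payload[4096]; };

inline bool object_destroyed_but_block_lives() {
    std::weak_ptr<Big> wp;
    {
        // make_shared fuses Big's storage and the control block
        // into ONE allocation.
        auto sp = std::make_shared<Big>();
        wp = sp;
    } // Big's destructor runs here when the last shared_ptr dies...
    // ...but the single allocation (control block + Big's 4 KiB of
    // storage) cannot be freed until `wp` is also gone. All we can
    // portably observe is that the object is expired.
    return wp.expired();
}
```

With separate `new` plus `shared_ptr`, only the small control block would linger; with make_shared, the whole fused block does.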
- Huh? What do you mean? The point of std::weak_ptr is that it's non-owning, so it has no effect on the lifetime of the pointed object.
- They're both problems, and forgotten pointers are more common in C, or C++ before 2011 (unique_ptr vs manual new/delete).
- What's worse in languages like Go, which I love, is that you won't even immediately know how to solve this unless you have experience dropping down into doing things you just would have normally done in C or C++.
Even the Go authors themselves on Go's website display a process of debugging memory usage that looks identical to a workflow you would have done in C++. So, like, what's the point? Just use C++.
I really do think Go is nice, but at this point I would relegate it to the workplace where I know I am working with a highly variable team of developers who in almost all cases will have a very poor background in debugging anything meaningful at all.
- The author argues that if rewriting a C++ codebase in Rust makes it more memory-safe, that's not because Rust is memory-safe. What?
- That’s because the author thinks it’s the second system syndrome carrying the weight.
I think Rust is probably doing the majority of the work unless you’re writing everything in unsafe. And why would you? Kinda defeats the purpose.
- I would argue that rewrite in C++ will make it a lot better. Rust does have some nice memory safe features that are nice enough that you should question why someone did a rewrite and stuck with C++, but that C++ rewrite would fix a lot.
- Fresh codebases have more bugs than mature codebases. Rewriting does not fix bugs; it is a fresh codebase that may have different bugs but extremely rarely fewer bugs than the codebase most of the bugs have been patched out of. Rewriting it in Rust reduces the bugs because Rust inherently prevents large categories of bugs. Rewriting it in C++ has no magical properties that initially writing it in C++ doesn't, especially if you weren't around for the writing of the original. Maybe if there is some especially persnickety known bug that would require a major rearchitecture and you plan to implement this architecture this time around, but that is not the modal bug, and the article is especially talking about memory safety bugs which are a totally separate kind of thing from that.
- I think there is significant merit to rewriting a legacy C++ (or C) codebase in very modern C++. I've done it before and it not only greatly reduced the total amount of code but also substantially improved the general safety. Faster code and higher quality. Because both implementations are "C++", there is a much more incremental path and the existing testing more or less just works.
By contrast, my experience with C++ to Rust rewrites is that the inability of Rust to express some useful and common C++ constructs causes the software architecture to diverge to the point where you might as well just be rewriting it from scratch because it is too difficult to track the C++ code.
- You left out the full argument (to be clear, I don't agree with the author, but in order to disagree with him you have to quote the full argument):
The author is arguing that the main reason rewriting a C++ codebase in Rust makes it more memory-safe is not because it was done in Rust, but because it benefits from lessons learned and knowledge about the mistakes done during the first iteration. He acknowledges Rust will also play a part, but that it's minor compared to the "lessons learned" factor.
I'm not sure I buy the argument, though. I think rewrites usually introduce new bugs into the codebase, and if it's not the exact same team doing the rewrite, then they may not be familiar with decisions made during the first version. So the second version could have as many flaws, or worse.
- The argument could be made that rewriting in general can make a codebase more robust, regardless of the language. But that's not what the article does; it makes it specifically about memory safety:
> That’s how I feel when I see these companies claim that rewriting their C++ codebases in Rust has made them more memory safe. It’s not because of Rust, it’s because they took the time to rethink and redesign...
If they got the program to work at all in Rust, it would be memory-safe. You can't claim that writing in a memory-safe language is a "minor" factor in why you get memory safety. That could never be proven or disproven.
- My only objection to your initial comment was that you left out the main gist of the argument (your later paraphrase says the same as I did).
I'm not defending TFA, I'm saying if you're going to reject the argument you must quote it in full, without leaving the main part.
- Did you read what they wrote? Their point is that doing a fresh rewrite of old code in any language will often inherently fix some old issues - including memory safety ones.
Because it's a re-write, you already know all the requirements. You know what works and what doesn't. You know what kind of data should be laid out and how to do it.
Because of that, a fresh re-write will often erase bugs (including memory ones) that were present originally.
- That claim appears to contradict the second-system effect [0].
The observation is that second implementation of a successful system is often much less successful, overengineered, and bloated, due to programmer overconfidence.
On the other hand, I am unsure of how frequently the second-system effect occurs or the scenarios in which it occurs either. Perhaps it is less of a concern when disciplined developers are simply doing rewrites, rather than feature additions. I don't know.
- I won't say the second-system effect doesn't exist, but I wouldn't say it applies every single time either. There's too many variables. Sometimes a rewrite is just a rewrite. Sometimes the level of bloat or feature-creep is tiny. Sometimes the old code was so bad that the rewrite fully offsets any bloat.
- Is there anything new here? Aren't these the same talking points people have been making for the past few years?
- I write C++ daily and I really can't take seriously arguments about how C++ is safe if you know what you're doing. Come on. Any sufficiently large and complex codebase tends to have bugs and footguns, and using tools like memory-safe languages limits the blast radius considerably.
Smart pointers are neat, but they are not a solution for memory safety. Just using standard containers and iterators can lead to lots of footguns, as can utilities like string_view.
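A minimal sketch of the string_view footgun (names mine, assuming C++17):

```cpp
#include <string>
#include <string_view>

// SAFE: the view refers to a string that outlives it.
inline bool view_of_live_string() {
    std::string s = "hello world";
    std::string_view v = s;  // non-owning: just a pointer + length
    return v.substr(0, 5) == "hello";
}

// FOOTGUN (do not do this): binding a view to a temporary.
//   std::string_view v = std::string("oops");  // temporary dies here
//   use(v);                                    // dangling view: UB
```

Nothing in the type system distinguishes the safe version from the dangling one; it all depends on the lifetime of the string behind the view.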
- The one thing I'll say here is age of the language really is and always has been a superficial argument; it's only six years apart from Python, and it's far less controversial of a language choice: https://en.wikipedia.org/wiki/History_of_Python .
Either way, it's hard not to draw parallels between all the drama in US politics and the arguments about language choice sometimes; it feels like both sides lack respect for the other, and it makes things unnecessarily tense.
- > Here’s a rule of thumb I like to follow for C++: make it look as much like C as you possibly can, and avoid using too many advanced features of the language unless you really need to.
Also, avoid using C++ classes while you're at it.
I recently had to go back to writing C++ professionally after a many-year hiatus. We code in C++23, and I got a book to refresh me on the basics as well as all the new features.
And man, doing OO in C++ just plain sucks. Needing to know things like copy and swap, and the Rule of Three/Five/Zero. Unless you're doing trivial things with classes, you'll need to know these things. If you don't need to know those things, you might as well stick to structs.
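The usual escape hatch from the Rule of Three/Five is the "Rule of Zero": hold every resource in a member type that already implements the five special functions correctly, and write none of them yourself. A minimal sketch (names mine):

```cpp
#include <memory>
#include <string>
#include <utility>

// No destructor, no copy/move constructors, no assignment operators
// written by hand: std::string and std::unique_ptr manage their own
// resources. Because unique_ptr is move-only, Widget is automatically
// movable but not copyable, with no code written for it.
struct Widget {
    std::string name;
    std::unique_ptr<int> cache;
};

inline bool rule_of_zero_moves() {
    Widget w{"gizmo", std::make_unique<int>(7)};
    Widget moved = std::move(w);  // compiler-generated move, correct
    return moved.name == "gizmo" && *moved.cache == 7;
}
```

The moment you manage a raw resource directly in the class, you are back to needing all of copy-and-swap and the Rule of Five, which is the pain being described.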
Now I'll grant C++23 is much nicer than C++03 (just import std!). I was so happy to hear about optional, only to find out how fairly useless it is compared to pretty much every language that has implemented a "Maybe" type. Why add the feature if the compiler is not going to protect you from dereferencing without checking?
- std::optional does have dereference checking, but it's a run-time check: std::optional<T>::value(). Of course, you'll get an exception if the optional is empty, because there's nothing else for the callee to do.
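For contrast, a minimal sketch of the two access paths (the function name is mine, assuming C++17):

```cpp
#include <optional>

// .value() on an empty optional throws std::bad_optional_access;
// *opt on an empty optional is undefined behavior, with no check at all.
inline bool value_throws_when_empty() {
    std::optional<int> empty;
    try {
        (void)empty.value();  // checked access: throws
    } catch (const std::bad_optional_access&) {
        return true;
    }
    return false;
    // int x = *empty;  // unchecked: UB, the case the parent laments
}
```

So the check exists, but only at run time and only if you use the right spelling; nothing forces the caller to test has_value() before dereferencing with `*` or `->`.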
- I really don't like Object Oriented programming anywhere. Maybe Smalltalk had it right, but I've not messed with Pharo or anything else enough to get a feel for it.
CLOS seems pretty good, but then again I'm a bit inexperienced. Bring back Dylan!
- I believe most C++ gripes are a classic case of PEBKAC.
One of the most common complaints is the lack of a package manager. I think this stems from a fundamental misunderstanding of how the ecosystem works. Developers accustomed to language-specific dependency managers like npm or pip find it hard to grasp that for C++, the system's package manager (apt, dnf, brew) is the idiomatic way to handle dependencies.
Another perpetual gripe is that C++ is bad because it is overly complex and baroque, usually from C folks like Linus Torvalds[1]. It's pretty ironic, considering the very compiler they use for C (GCC), is written in C++ and not in C.
[1]: Torvalds' comment on C++ <https://harmful.cat-v.org/software/c++/linus>
- > Developers accustomed to language-specific dependency managers like npm or pip find it hard to grasp that for C++, the system's package manager (apt, dnf, brew) is the idiomatic way to handle dependencies.
Okay, but is that actually a good idea? Merely saying that something is idiomatic isn't a counterargument to an allegation that the ecosystem has converged on a bad idiom.
For software that's going to be distributed through that same package manager, yes, sure, that's the right way to handle dependencies. But if you're distributing your app in a format that makes the dependencies self-contained, or not distributing it at all (just running it on your own machines), then I don't see what you gain from letting your operating system decide which versions of your dependencies to use. Also this doesn't work if your distro doesn't happen to package the dependency you need. Seems better to minimize version skew and other problems by having the files that govern what versions of dependencies to use (the manifest and lockfile) checked into source control and versioned in lockstep with the application code.
Also, the GCC codebase didn't start incorporating C++ as an implementation language until eight years after Linus wrote that message.
- > find it hard to grasp that for C++, the system's package manager (apt, dnf, brew) is the idiomatic way to handle dependencies.
It's really not about being hard to grasp. Once you need a different dependency version than the system provides, you can't easily do it. (Apart from manual copies) Even if the library has the right soname version preventing conflicts (which you can do in C, but not really C++ interfaces), you still have multiple versions of headers to deal with. You're losing features by not having a real package manager.
- GCC was originally written in GNU C. Around GCC 4.9, its developers decided to switch to a subset of C++ to use certain features, but if you look at the codebase, you will see that much of it is still GNU C, compiled as GNU C++.
There is nothing you can do in C++ that you cannot do in C, due to Turing completeness. Many common things have ways of being done in C that work equally well or even better. For example, you can use balanced binary search trees in C without type errors producing enormous error messages full of type names that are sentences, if not paragraphs, long. Just grab BSD’s sys/tree.h, illumos’ libuutil or glib for some easy-to-use balanced binary search trees in C.
- > There is nothing you can do in C++ that you cannot do in C due to Turing Completeness.
While this is technically true, a more satisfying rationale is provided by Stroustrup here[0].
> Many common things have ways of being done in C that work equally well or even better. For example, you can use balanced binary search trees in C without type errors creating enormous error messages from types that are sentences if not paragraphs long. Just grab BSD’s sys/tree.h, illumos’ libuutil or glib for some easy to use balanced binary search trees in C.
Constructs such as sys/tree.h[1] replicate the functionality of C++ classes and templates via the C macro processor. While they are quite useful, asserting that macro-based definitions provide the same type safety as C++ types is simply not true.
As to whether macro use results in "creating enormous error messages" or not, that depends on the result of the textual substitution. I can assure you that I have seen reams of C compilation error messages due to invalid macro definitions and/or usage.
- One place where C macros provide functionality that C++ classes and templates cannot is stringification of their argument(s).
For example:
    #include <iostream>

    #define SQL(statement) #statement

    int main(int ac, const char *av[]) {
        const char *select = SQL(select * from some_table);
        std::cout << select << std::endl;
        return 0;
    }
- > Yes, C++ can be unsafe if you don’t know what you’re doing.
It is even if you do.
> But here’s the thing: all programming languages are unsafe if you don’t know what you’re doing.
But here's the thing, that's not a good argument because...
> will just make it a lot harder to have memory leaks or safety issues.
... in reality it's not "just" anything. "Makes it a lot harder to have memory leaks or safety issues" means it's better.
- I'm not sure what I feel about the article's point on Boost. It does contribute a lot to the standard library and does provide some excellent libraries, like Boost.Unordered.
- Boost is an awful whole with a couple very nice tiny parts inside.
If you can restrict yourself to the 'good' parts then it can be OK, but it's pulling in a huge dependency for very little gain these days.
- I'm old enough to recall when boost first came out, and when it matured into a very nice library. What's happened in the last 15 years that boost is no longer something I would want to reach for?
- C++11 through 17 negated a lot of its usefulness - the standard library does a lot of what Boost originally offered.
Alternative libraries like Qt are more coherent and better thought out.
- Qt is... fine... as long as you're willing to commit and use only Qt instead of the standard library. It's from before the STL came out, so the two don't mesh together really at all.
- The safety part in this article is incorrect. There's a Google doc somewhere where Google did an internal experiment and determined that safety cannot be achieved in C++ without an owning reference (essentially what Rust has).
- Terrible article.
> you can write perfectly fine code without ever needing to worry about the more complex features of the language
Not really because of undefined behaviour. You must be aware of and vigilant about the complexities of C++ because the compiler will not tell you when you get it wrong.
I would argue that Rust is at least in the same complexity league as C++. But it doesn't matter because you don't need to remember that complexity to write code that works properly (almost all of the time anyway, there are some footguns in async Rust but it's nothing on C++).
> Now is [improved safety in Rust rewrites] because of Rust? I’d argue in some small part, yes. However, I think the biggest factor is that any rewrite of an existing codebase is going to yield better results than the original codebase.
A factor, sure. The biggest? Doubtful. It isn't only Rust's safety that helps here, it's its excellent type system.
> But here’s the thing: all programming languages are unsafe if you don’t know what you’re doing.
Somehow managed to fit two fallacies in one sentence!
1. The fallacy of the grey - no language is perfect therefore they are all the same.
2. "I don't make mistakes."
> Just using Rust will not magically make your application safe; it will just make it a lot harder to have memory leaks or safety issues.
Not true. As I said already Rust's very strong type system helps to make applications less buggy even ignoring memory safety bugs.
> Yes, C++ can be made safer; in fact, it can even be made memory safe. There are a number of libraries and tools available that can help make C++ code safer, such as smart pointers, static analysis tools, and memory sanitizers
lol
> Avoid boost like the plague.
Cool, so the ecosystem isn't confusing but you have to avoid one of the most popular libraries. And Boost is fine anyway. It has lots of quite high quality libraries, even if they do love templates too much.
> Unless you are writing a large and complex application that requires the specific features provided by Boost, you are better off using other libraries that are more modern and easier to use.
Uhuh what would you recommend instead of Boost ICL?
I guess it's a valiant attempt but this is basically "in defense of penny farthings" when the safety bicycle was invented.
- > Just using Rust will not magically make your application safe; it will just make it a lot harder to have memory leaks or safety issues.
Even if we take this claim at face value, isn’t that great?
Memory safety is a HUGE source of bugs and security issues. So the author is hand-waving away a really really good reason to use Rust (or other memory safe by default language).
Overall I agree this seems a lot like “I like C++ and I’m good at it so it’s fine” with justifications created from there.
- I think this is a case of two distinct populations being inappropriately averaged.
There are many high-level C++ applications that would probably be best implemented in a modern GC language. We could skip the systems language discussion entirely because it is weird that we are using one.
There are also low-level applications like high-performance database kernels where the memory management models are so different that conventional memory safety assumptions don’t apply. Also, their performance is incredibly tightly coupled to the precision of their safety models. It is no accident that these have proven to be memory safe in practice; they would not be usable if they weren’t. A lot of new C++ usage is in these areas.
Rust to me slots in as a way to materially improve performance for applications that might otherwise be well-served by Java.
- It doesn't mention the horrific template error messages. I'd heard that this was an area targeted for improvement a while ago... Is it better these days?
- Qualitatively better. C++20 'concepts' obviated the need for the arcane metaprogramming tricks responsible for generating the vast majority of that template vomit.
Now you mostly get an error to the effect of "constraint foo not satisfied by type bar" at the point of use that tells you specifically what needs to change about the type or value to satisfy the compiler.
- So Boost is dying off? Good to know.
- What’s a good (ie: opinionated) code formatter and unit test framework for C++ these days?
I just had a PR on an old C++ project, and spending 8 years in the web ecosystem has raised the bar around tooling expectations.
Rust is particularly sweet to work with in that regard.
- My go-to for formatting would be clang-format, and for testing gtest. For more extensive formatting (that involves the compiler), clang-tidy goes a long way.
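If it helps, clang-format picks up a `.clang-format` file from the project root; a minimal starting point (the specific values here are just one possible taste) looks like:

```yaml
# .clang-format -- found automatically by clang-format when run in the repo
BasedOnStyle: LLVM
IndentWidth: 4
ColumnLimit: 100
```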
- I think you meant for more extensive static analysis. Clang-tidy is really awesome. There is also Facebook's Infer.
https://fbinfer.com
- The only formatter is clang-format, and it isn't very good. Better than nothing though.
- Catch2 is great as a unit test framework.
Running unit tests with the address sanitizer and UB sanitizer enabled goes a long way towards addressing most memory safety bugs. The kind of C++ you write then is a far cry from what the haters complain about with bad old VC6-era C++.
- The article says "I think the biggest factor is that any rewrite of an existing codebase is going to yield better results than the original codebase.".
Yeah, sorry, but no, ask some long-term developers about how this often goes.
- It depends on the codebase. If the code base deserves to be a case study in how not to do programming, then a rewrite will definitely yield better results.
I once encountered this situation with C# code written by an undergraduate, rewrote it from scratch in C++ and got a better result. In hindsight, the result would have been even better in C, since I spent about 80% of my time fighting with C++ trying to use every language feature possible. I had just graduated from college, and while my code was better on the whole, it did a number of things wrong too (although far fewer, to my credit). I look back at it in hindsight and think less is more when it comes to language features.
I actually am currently maintaining that codebase at a health care startup (I left shortly after it was founded and rejoined not that long ago). I am incrementally rewriting it to use a C subset of C++ whenever I need to make a change to it. At some point, I expect to compile it as C and put C++ behind me.
- I think, if one of the most prominent C++ experts in the world (Herb Sutter), who chaired the C++ standards committee for 20+ years, and who has evangelized the language for even longer than that, decides that complexity in the language has gotten out of control and sits down to write a simpler and safer dialect, then that is indicative of a problem with the language.
My viewpoint on the language is that there are certain types of engineers who thrive in the complexity that is easy to arrive at in a C++ code base. These engineers are undoubtedly very smart, but, I think, lack a sense of aesthetics that I can never get past. Basically, the r/atbge of programming languages (Awful Taste But Great Execution).
- "Rust shines in new projects where safety is the priority, while C++ continues to dominate legacy systems and performance-critical domains."
the truth
- Hasn’t Rust been shown to be very fast, especially since it can elide a lot of safety checks that would otherwise be necessary to prevent bugs?
On legacy code bases, sure. C++ rules in legacy C++ codebases. That’s kind of a given isn’t it? So that’s not a benefit. Just a fact.
- > "while C++ continues to dominate ... performance-critical domains"
Why performance-critical domains? Does C++ have a performance edge over Rust?
- I am not sure C++ needs a defense. Especially after the C++11 cleanup.
- I don't think there could be a purer expression of the Blub Paradox.
> Just use whatever parts of the language you like without worrying about what's most performant!
It's not about performant. It's about understanding someone else's code six months after they've been fired, and thus restricting what they can possibly have done. And about not being pervasively unsafe.
> "I don’t think C++ is outdated by any stretch of the imagination", "matter of personal taste".
Except of course for header files, forward declarations, Make, the true hell of C++ dependency management (there's an explicit exhortation not to use libraries near the bottom), a thousand little things like string literals actually being byte pointers no matter how thoroughly they're almost compatible with std::string, etc. And of course the pervasive unsafety. Yes, it sure was last updated in 2023, the number of ways of doing the same thing has been expanded from four to five but the module system still doesn't work.
> You can write unsafe code in Python! Rewriting always makes the code more safe whether it's in Rust or not!
No. Nobody who has actually used Rust can reasonably arrive at this opinion. You can write C++ code that is sound; Rust-fluent people often do. The design does not come naturally just because of the process of rewriting, this is an entirely ridiculous thing to claim. You will make the same sorts of mistakes you made writing it fresh, because you are doing the same thing as you were when writing it fresh. The Rust compiler tells you things you were not thinking of, and Rust-fluent people write sound C++ code because they have long since internalized these rules.
And the crack about Python is just stupid. When people say 'unsafe' and Rust in the same sentence, they are obviously talking about UB, which is a class of problem a cut above other kinds of bugs in its pervasiveness, exploitability, and ability to remain hidden from code review. It's 'just' memory safety that you're controlling, which according to Microsoft is 70% of all security related bugs. 70% is a lot! (plus thread safety, if this was not mentioned you know they have not bothered using Rust)
In fact the entire narrative of 'you'll get it better the second time' is nonsense, the software being rewritten was usually written for the first time by totally different people, and the rewriters weren't around for it or most of the bugfixes. They're all starting fresh, the development process is nearly the same as the original blank slate was - if they get it right with Rust, then Rust is an active ingredient in getting it right!
> Just use smart pointers!
Yes, let me spam angle brackets on every single last function. 'Write it the way you want to write it' is the first point in the article, and here is the exact 'write it this way' it was critiquing. And you realistically won't do it on every function, so it is just a matter of time until one of the functions you use regular references with creates a problem.
- > In fact the entire narrative of 'you'll get it better the second time' is nonsense, the software being rewritten was usually written for the first time by totally different people, and the rewriters weren't around for it or most of the bugfixes. They're all starting fresh, the development process is nearly the same as the original blank slate was - if they get it right with Rust, then Rust is an active ingredient in getting it right!
Yes, this is a serious flaw in the author's argument. Does he think the exact same team that built version 1.0 in C++ is the one writing 2.0 in Rust? Maybe that happens sometimes, I guess, but to draw a general lesson from that seems weird.
- Python’s “there should be one obvious way to do it” slogan often collides with reality these days too, since the language sprawled into multiple idioms just like C++: for printing you can use print("hi"), f-strings like f"hi {x}", .format(), % formatting, or concatenation with +; for loops you can iterate with for i in range(n), list comprehensions [f(i) for i in seq], generator expressions (f(i) for i in seq), or map/filter/lambda; unpacking can be done with a,b=pair, tuple() casting, slicing, *args capture, or dictionary unpacking with **; conditionals can be written with if/else blocks, one-line ternary x if cond else y, and/or short-circuit hacks, or pattern matching match/case; default values can come from dict.get(k,default), x or default, try/except, or setdefault; swapping variables can be done with a,b=b,a, with a temp var, with tuple packing/unpacking, or with simultaneous assignment; joining strings can be done with "".join(list), concatenation in a loop, reduce(operator.add, seq), or f-strings; reading files can be open().read(), iterating line by line with for line in f, using pathlib.Path.read_text(), or with open(...) as f; building lists can be done with append in a loop, comprehensions, list(map(...)), or unpacking with [*a,*b]; dictionaries can be merged with {**a,**b}, a|b (Python 3.9+), dict(a,**b), update(), or comprehensions; equality and membership checks can be ==, is, in, any(...), all(...), or chained comparisons; function arguments can be passed positionally, by name, unpacked with * and **, or using functools.partial; iteration with indexes can be for i in range(len(seq)), for i,x in enumerate(seq), zip(range(n),seq), or itertools; multiple return values can be tuples, lists, dicts, namedtuples, dataclasses, or objects; even truthiness tests can be if x:, if bool(x):, if len(x):, or if x != []:. Whew!
- But hey at least python forces you to use whitespace properly hinthinthint
- If there are as many typos in some code as are in the article, there would be a whole lot of segfaults, too