Hacker News

The bullet list near the end closely matches my own experience with C++. There's an inordinate number of features creating an even worse profusion of edge cases where they interact. Then the "solutions" for those edge cases usually add even more complexity. Worse, they force programmers to coddle their compilers. The ratio between what a C++ compiler will accept and what it will produce sane code for is huge. That's why every C++ codebase I've ever seen is full of code to do things that shouldn't be necessary, and goes through all sorts of contortions to avoid doing things that should be OK, lest the compiler spew out an un-debuggable mess.

I'm perfectly happy managing my own complexity in C, or avoiding it entirely in Python. C++, on the other hand, seems to have been designed by compiler writers for their own enjoyment and/or job security. Every other "systems programming language" from D to Objective-C to Go to Rust to Nim presents a more coherent face to the programmer.




> C++, on the other hand, seems to have been designed by compiler writers for their own enjoyment and/or job security.

Being a C++ compiler writer (Zortech C++, Symantec C++, Digital Mars C++) I can assure you this is not true at all.

As to why C++ is so complex, my opinion is it is because it was designed a long time ago, what is considered better practice in designing languages has moved on, and C++ is unwilling to let go of the old decisions.

(There's always some major and vocal user who has built their entire store around some ancient feature.)

For example, why does C++ still support EBCDIC?


> Being a C++ compiler writer (Zortech C++, Symantec C++, Digital Mars C++) I can assure you this is not true at all.

Yeah, after I wrote that I realized it wasn't quite right. C++ is designed by compiler-writer wannabes. Architecture astronauts[1] on standards committees. They think they understand how compilers should work, and that adding support for this or that should be easy. "You just need to..." is their favorite opening. I see plenty of this in distributed storage, too. "It's so simple, I'd do it myself, but it's not worth my time, you go do what I said." The C++ designers seem hung up on an abstract model of machines and compilers that's a poor match for any real machine or compiler ever, and the actual compiler writers have to bridge the gap. Thank you for your efforts, which are Herculean in an Augean-stables kind of way.

[1] https://www.joelonsoftware.com/2001/04/21/dont-let-architect...


You can assure us based on what? If you have insider knowledge or particular credentials please share.

So far it looks like you're ranting.


http://www.walterbright.com/

> Walter Bright is the creator and first implementer of the D programming language and has implemented compilers for several other languages. He's an expert in all areas of compiler technology, including front ends, optimizers, code generation, interpreter engines and runtime libraries. Walter regularly writes articles about compilers and programming, is known for engaging and informative presentations, and provides training in compiler development techniques. Many are surprised to discover that Walter is also the creator of the wargame Empire, which is still popular today over 30 years after its debut.

Granted it's his own site, but uh, seems legit..?


Thanks, but my reply was for notacoward. Unfortunately I can't edit or delete it any more.


Oh, er, sorry about that.


  "I can assure you this is not true at all."
Was quoted, not the user you are replying to.


He assures us based on him being Walter Bright. That’s good enough for me.


Walter Bright's response is the parent of the comment in question. This one seems aimed at the top-level commenter's response, which advanced from the pained venting in the first comment (which I understand and can sympathize with) to a much more assertive tone.


From context, the comment I was responding to seemed attached to the wrong parent and seemed aimed at Walter's comment instead.


Yes, sorry for the confusion. On mobile the final sentence in the quote looked like it actually belonged to notacoward.

My comment was aimed at notacoward.


> why does C++ still support

Stop right there! There is plenty of evidence that removing features from a language is fatal to adoption. Both Perl and Python have suffered from this.

Specifically for trigraphs (apart from those, EBCDIC support doesn't affect compilers on other systems), IBM has a vote and voted not to remove them: https://isocpp.org/files/papers/N4210.pdf


How has Python suffered from this? They broke things going from 2 to 3... and it was still the fastest growing language from 2017. It clearly wasn't popular just because they removed features, but you can't say that removing things is fatal to adoption.


IIRC they did lose the final vote though, and trigraphs are one of the few features ever removed from C++.


I miss them - I enjoyed being able to write ??=include at the top of my files.


Hey, digraphs are still there, and they are almost as much fun as trigraphs!


THIS is the reason IMO too. C++ has taken on the very difficult task of remaining broadly compatible with C and with its own legacy features while continuously evolving over the decades, incorporating whatever was the state of the art at the time, without new features breaking old code. That is not an easy task to pull off without increasing complexity.


The book "Design and Evolution of C++" is quite interesting in that regard.

For all its warts, C++ only got adopted inside AT&T and later by almost every C compiler vendor, because it just fitted on their existing toolchains.

Even the lack of modules is related to that: C++ object files needed to look just like C ones.

Now that C++ is grown up and can live on its own, it needs to pay for the crazy days of its parties going out with C. :)


>The book "Design and Evolution of C++" is quite interesting in that regard.

I found that book very interesting in many regards. I had bought and read it several years ago (out of interest, though I have not worked on C++ professionally).

Stroustrup goes into a lot of details about the reasons for many design decisions in the language. While I'm aware that C++ has some issues, I was really impressed by the level of thought etc., that he shows in that book, when he talks about all the reasons for doing various things the way he did them.


Some might say the party never ended. :-)


> incorporating whatever was the state of the art at that time

State of the art or flavor of the month? For instance, the features from functional programming that C++ and Java recently (in the last decade) added weren't anything new. When functional programming started to become more popular was when their features started showing up in C++ and Java.

If people are concerned that your language is already too large, then adding elements from other programming paradigms just because they're suddenly what's hot doesn't seem like a great idea. It feels like some languages are chasing the crowd, which can lead to a messy language ("OOP is all the rage now? Our language is all about OOP! Oh, functional is all the rage now? Well, we just nailed on some functional features!").


Eh, C++ can hardly be criticized for pandering to the flavor of the month.

For example, lambdas were finally added only in C++11, even though the STL had had a functional flavor since the late '90s and sorely needed lambda expressions. Only after people went out of their way to build lambdas on top of macros, expression templates, and whatnot were they finally added to the language.


It's true that isn't easy.

It's also true that it may not be necessary.

What makes a good language? One that randomly accumulates state of the art ideas about programming without breaking old code, or one that gets out of the way and allows requirements to be expressed reliably and relatively simply?

Of course C++ is used because it's fast. It's fine in limited domains like DSP.

But what is the rationale for a language that whimsically accumulates new features every couple of years, while failing to deal with basic challenges like memory management?

It's not as if it's ever going to reach a critical mass and turn into Haskell.


> For example, why does C++ still support EBCDIC?

Because there are companies like Unisys and IBM that want to sell C++ compilers to their mainframe customers.


IBM can support it as an extension in their C++ compiler. No need to burden the rest of the community with it. It's not like C++ compiler vendors are shy about adding extensions :-)

Despite C++ supporting EBCDIC, I seriously doubt the overwhelming majority of string processing C++ code will work with EBCDIC anyway, because the programmers probably never heard of it, let alone tested the software with it.


> No need to burden the rest of the community with it.

yeah, the problem is when IBM goes with big checks to national standard bodies and complains "muuuuhh we won't be able to assure that the systems we sold you in 1970 will still work in 20 years if the C++ standard removes support for EBCDIC" and then these standard bodies write strongly-worded letters to the ISO commitee: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2009/n291...


I think that implying bribery here is dishonest. Six of the seven authors of the paper are IBM employees, and they are just voting according to their employer's wishes.


The Committee recently got rid of trigraphs (required for EBCDIC compatibility). IBM was strongly against the proposal but was finally outvoted. They do keep the functionality as a conforming extension in their compiler, but now that they have been removed from the standard, the language might evolve in ways that might make the extension non-conforming.


Trigraphs can be supported with the simple expedient of putting a filter in front of the compiler that converts the trigraph sequences to the corresponding character. It doesn't have to be in the compiler itself.

In fact, trigraphs were designed to operate this way.

That is, until the addition of raw string literals broke that.


> For example, why does C++ still support EBCDIC?

You might imagine EBCDIC was a thing of the distant past, and you'd be wrong. As of at least 2013 there were still production systems using EBCDIC being actively developed. In COBOL. And not just at IBM.


What support does C++ have for EBCDIC?


https://en.wikipedia.org/wiki/EBCDIC#Compatibility_with_ASCI...

Most programmers manipulate characters as if they were ASCII, and that code will break if presented with EBCDIC. The C++ Standard is carefully worded to not assume ASCII encoding.

C++ presumably supports other encodings, but I've never heard of another one that will work. There's RADIX50, but that will never work with C++, for the simple reason that there aren't enough characters in it.

https://en.wikipedia.org/wiki/DEC_Radix-50

You can also see the EBCDIC support in oddities like the digraph support.


Sure, it's a large and complex language that takes time to master. But I'm interested to hear examples of what you call 'profusion of edge cases'.

> The ratio between what a C++ compiler will accept and what it will produce sane code for is huge.

As is the case for any programming language.

> C++, on the other hand, seems to have been designed by compiler writers for their own enjoyment and/or job security.

C++ is designed by its standards committee... If you know anything about the struggles compiler writers have had with implementing the standard, you'd know the standards committee definitely does not consist of compiler writers! It's really cheap to dismiss their efforts as motivated by job security, if you ask me... I'd recommend attending a meeting or reading some proceedings to convince yourself otherwise.


> I'm interested to hear examples of what you call 'profusion of edge cases'.

For one example, look at the profusion of cases in type deduction, depending on whether one is dealing with a value, an array, a pointer, or one of two different references, and whether qualified as const or volatile.

One might argue that these cases are too prevalent to be called 'corner' cases, but that doesn't exactly help! In C++11 and C++14 there was the indisputable corner case where auto and template type deduction differed with regard to braced initializer lists, though in a rare case of backwards-compatibility-breaking, it has now been fixed [1].

Scott Meyers, for one, has given examples of particular cases in the use of braced initialization, especially in the context of templates, that can be considered corner cases in that they are probably not likely to arise very often in most of the C++ code that is being written for applications.

[1] https://isocpp.org/blog/2014/03/n3922

[2] Scott Meyers, 'Effective Modern C++', pp 52-58.


One fairly common example of a bad C++ design edge case is the “Most Vexing Parse” problem [0].

I frequently find that constructing objects using the () syntax produces parse errors because the compiler is expecting a function declaration; replacing the () with {} just fixes it. It's really frustrating that bad design like this is maintained as a stumbling block for new users instead of being fixed.

[0] https://en.wikipedia.org/wiki/Most_vexing_parse


> C++ is designed by its standards committee...

When it comes to design, C++ is a good example of why having a benevolent dictator is better than a committee. I still think it's a huge mistake to not have a standard ABI and rely on C's ABI.


Definitely agree about the ABI. In C# it's a pleasure to write libraries for others and use other libraries whereas in C++ it's almost always a pain.


Standard ABI only makes sense if you are targeting a single system (.NET for C#). That's not the case for C++.


It would definitely make interop between C# and C++ much easier. It would make interop between C++ and any language much easier. The only way to do it today is to extern "C" everything. It's ugly.


How exactly would a library compiled for 32-bit big-endian POWER interoperate with a program written for 64-bit little-endian ARM?


This doesn't work with C either, or does it? C has a reasonable ABI, so maybe we should get to the point where exchanging C++ libraries is as easy as doing it with C.


The point is that there is no standard C++ (or even C) ABI because it is not possible to standardize one. At best you can standardize a platform-specific ABI. That is in fact what happens in practice on most platforms, but it is up to the platform maintainers, and the C++ standard itself has nothing to say about it.

Regarding the C ABIs, they are the lingua franca for interchange between languages because a) as C is semantically poor, it is the lowest common denominator, and b) it is often the OS ABI.


> I still think it's a huge mistake to not have a standard ABI and rely on C's ABI.

And Rust is going down the same path. Looks like C will reign supreme for shared libraries for the foreseeable future.


Disclaimer: I don't program in C++ day to day, so maybe my experience is atypical.

Moves and rvalue references (and whatever a PR-value is) and even RVO scare me. They make me want to pass pointers around, because at least I know for sure what'll happen then. (And, funnily enough, C++ seems worse than dynamic languages for this -- more magic around function calls and returns than C or Python or JavaScript.)


> scare me ... because at least I know for sure what'll happen then.

Which is just because you once took the time to learn how pointers behave. Similarly, if you'd just take the time to learn the basics of rvalues, moving, RVO, ..., you won't be scared by them anymore. Might take longer than pointers, sure, but it's worth it.


Maybe this is just my experience, but it took me far longer to understand the subtleties of move semantics, rvalues, and RVO than to understand pointers and references in C++. And this is not even getting into “universal references” (which I don’t have a comfortable understanding of either)


There's a YouTube video [1] of a great talk by Scott Meyers: Effective Modern C++ (part 1). He covers universal references and how they relate to other kinds of references in either that one or Part 2 [2]. I found the videos very helpful.

Edit - I may have the wrong videos linked. It could be his talk on Universal References [3] that I'm thinking of. It's been a while.

[1] https://www.youtube.com/watch?v=fhM24zs1MFA

[2] https://www.youtube.com/watch?v=-7qwpuA3EpU

[3] https://channel9.msdn.com/Shows/Going+Deep/Cpp-and-Beyond-20...


They really aren't that bad... in fact, they're pretty useful. Source: your everyday C++ programmer


They’re great, really. Yes, it takes effort to master, but then you can generate faster, safer, generic code with much less effort.

Source: Another everyday C++ programmer. There are dozens of us. Dozens!


It's unfortunate when your codebase is C++03 only and intends to remain that way for compatibility with other vendors. Yay for consumer electronics!


Out of curiosity what breaks compatibility?


My understanding is that there are partner vendors who have old toolchains set up only for C++03 and thus would have to upgrade their toolchains to build our API if we upgraded.


C++ is a mess but it's one of the few languages that gives you low-level control of memory and fast code with zero or near zero cost abstractions. So for a certain class of application it's still the best choice. Music software, for example, is pretty much exclusively written in C++. I don't enjoy C++ the language very much but you can build some very cool things with it.

Personally I'm hoping Rust displaces it from most of these remaining niches but even if it does it will probably happen slowly.


> it's one of the few languages that gives you low-level control of memory and fast code with zero or near zero cost abstractions. So for a certain class of application it's still the best choice.

For a certain class of program, you mean. For applications specifically, the advantages you mention are barely relevant. Usually only small parts of a whole application need low-level control of memory etc. Those can be written in C, with the rest written in a cleaner higher-level language that interfaces easily with C (there are many such).

C++ is proof that a single language can't satisfy all needs. It tries to do the low-level stuff as well as C, and fails because it can't produce libraries that are as easy to link to. Then it tries to do all the high-level stuff as well as other languages, and utterly fails because it can't get away from its low-level roots. D, Rust, and Nim all make better compromises that suit just about all C++ use cases. Go and Pony do a better job for some but not others. I won't say there's no room for C++ in a sane computing world, but its legitimate niche is very small indeed.


>For a certain class of program, you mean. For applications specifically, the advantages you mention are barely relevant. Usually only small parts of a whole application need low-level control of memory etc.

I'd say Rust has a similar level of control. I just rewrote our longest build step (37 minutes on a normal build) in Rust. By having control over when things are allocated I could get it down to about 20 seconds. The previous software is written in Java.

If you want speed you need to choose C, C++ or Rust. If you want it safely, then Rust. I'd argue in my case that Rust was probably the fastest choice. As I probably would have copied more in C/C++. In Rust I can trust that my memory is intact.

I'd also choose C over C++ though. I find it's a much more manageable language. I never found it hard to make the right abstractions in it, except for maybe a lack of generics (which C11 kind of solves for).


We are starting to see that trend on GUI frameworks and game engines.

C++ is still there, doing what it does best, driving pixels around with maximum performance.

But the upper layer, exposed to app developers and game designers, is done in another language.


Very very slowly.

Only now is embedded development starting to accept C++, and C still rules there anyway.

Which means it took about 20 years to reach this point.

And still Rust will need to go through the same certification processes that C, C++, Ada and Java enjoy for such scenarios.


I used C++ for embedded CPU 68332 (25 MHz CPU) with 4MB of SRAM in ~1996 for DNA sequencer machine.

~100 + classes, single inheritance, 1,2, 3 Axis motor controls, CCD Camera, Laser, serial com channel, scripting engines, etc.

No templates, no virtual functions. It worked very well at that time.

The compiler setup at that time: AT&T cfront generated C from the C++ code, running on a Mac, and an embedded C cross-compiler generated the target code.

The classes were shared within the company across different machines (biotech robots) to maximize code reuse.


Very interesting, thanks.

I got introduced to C++ via Turbo C++ 1.0 for MS-DOS, in 1993.

So if it was good enough for 640 KB max, with 64KB executables, it shouldn't be an issue on most micro-controllers; the biggest issue is the existing dev culture.


Forgot to mention a couple of other design decisions:

- No new/delete operators in any C++ code.

- ISR code was also in C++.

- All objects were statically allocated - with 4MB of SRAM, one can easily see why. It lets the developer tightly control memory usage.

- All regression tests were automated. There were test scripts for all functional HW/SW components. We found one bug that triggered on a 24.9-day time frame (31-bit timer counter wraparound for the 10-millisecond timer call) - from that point on, all firmware releases passed 3 months of continuous testing on multiple systems before release.


Agree with your point: dev culture matters a lot. This was a Mac (PowerPC Mac) based dev house, and C++ was the big thing on the SW (Mac) side of the dev team.


In my career, I worked on 15+ projects - most were embedded systems. Only two projects were C++, and one of those had only a small subset in C++. On this project, 90% of the code base running on the target was C++ with a full OO design, and 80% of the classes were reused from other projects.


Thanks for sharing.

I eventually moved into Java/.NET stacks, but still follow C++, as my go-to tool when they need some unmanaged help.


Very interesting. Was the build environment all inside MPW?


I don't remember. Long time ago.... Likely just steps inside makefile.

Not really a big fan of the Mac at that time - it was before Steve Jobs came back and merged the OS with NeXT's. The Mac was very unstable; I remember it crashing 3-5 times a day during my daily tasks - editing, cross-compiling.


The C++ you wrote in 1996 is basically a different language from the C++ that you're encouraged to write today.


> The C++ you wrote in 1996 is basically a different language from the C++ that you're encouraged to write today.

We can differentiate between (a) what the language spec says, and (b) what various individuals advocate.

The code we write is generally constrained by (a), but we can usually substitute our own best judgment for (b).*

* Except when the people mentioned in group (b) have sway over the C++ standard.


Sounds like a sweet spot kind of project for C++.


There have been recent pushes to use Rust in embedded systems, but I agree it'll take a long time. Check out this blog [1].

[1] - http://blog.japaric.io/


Thanks.


What do you mean by embedded systems?

Here's an article from 1998 that gives an example of C++ being used in military avionics systems:

http://www.cs.wustl.edu/~schmidt/TAO-boeing.html

It's been used in civilian avionics for a long time, too. Not that it's necessarily the best choice in those environments, but "starting to accept" seems like a mischaracterization.


I mean the embedded systems where Assembly and C89 still rule, and where that is very hard to change, because the problem with adopting anything else is cultural.

Basically, while there are projects being done in C++, and many companies are finally migrating from C to C++, the large majority is sticking with C.

If you prefer to listen to someone actually relevant in the C++ embedded community, here is what Dan Saks has to say about it.

https://youtu.be/D7Sd8A6_fYU

http://cppcast.com/2016/10/dan-saks/


The embedded world has accepted C++ for all but the smallest microcontrollers. I was using C++ for embedded development 6 years ago and I was late to the party. All that prevented adoption by me before that was the cost of RAM.


Depends on the embedded system. I've worked on embedded systems running embedded Linux, with megabytes to gigabytes of storage, for the last dozen years, using C++.

If you're still dealing with an 8051, I'll agree that you're less likely to have moved to C++.

One other thought: Embedded toolchains are typically set at the start of the (original) project. I don't remember ever seeing a compiler upgrade happen in an existing embedded code base. And some embedded code bases live for decades.


I fully agree with you, and given my preference for type-safe languages, I find it interesting that other languages are also an alternative, depending on the use-case constraints.

However, from a few CppCon and Meeting C++ talks, it appears that in such scenarios moving beyond C is more of a cultural issue than a technological one.


>>Only now embedded development is starting to accept C++

Well, largely on Raspberry Pi kinds of platforms, which aren't even embedded systems. More like miniaturized desktops.

Then there is just C and only C. C's dominance there isn't going to be replaced anytime soon, if ever.


During 2017, BMW (among other car companies) and Sony migrated from C to C++ as their main language for embedded development.

https://archive.fosdem.org/2017/schedule/event/succes_failur...

https://www.autosar.org/

"Developing Audio Products with Cortex-M3/NuttX/C++11"

https://www.youtube.com/watch?v=T8fLjWyI5nI

Unless you consider their devices Raspberry Pi kinds of platforms.

Of course, with companies like Microchip still focusing on Assembly and C89, C is going to stay around for a very long time.

What is done in C with macros can be done more safely, and better optimized, with C++, constexpr, and templates; the problem is changing the culture of those companies.


> Well largely on RaspberryPi kind of platforms, which aren't even even embedded systems

No - for instance, Arduino uses C++ as its main language. And an Arduino pico has 2.5KB of RAM... that's firmly at the "embedded" scale of things.


I feel like embedded development should just avoid C++ and go with a managed language. I was hopeful about Go, but they kinda wrecked it for embedded.

The thing with embedded is you have two cases, hard real time and just don't care.


Embedded devs always care. Managed languages with non-deterministic GC will never be popular there.

What embedded devs end up needing are:

- ability to shove bits directly in and out of a particular memory address

- ability to write interrupt handlers with low and bounded latency (i.e. emit a different function pre/postamble)

- global mutable state


They go with Java in very specific cases, but you need big bucks to pay for something like PTC Perc Ultra, JamaicaVM or WebSphere Real Time.

Then there is microEJ as well, but their model is that performance-critical code is written in C anyway and exposed to Java.

Also, depending on how embedded one considers mobile phone hardware to be, there is Android and Android Things.

Then there are some OEMs still selling Basic, Pascal and Oberon compilers.

But in the embedded domains where only Assembly and C get used, C++ currently seems to be the only other language they care to adopt, especially given that it already enjoys the same kind of safety certifications (MISRA, DO-178 and similar).

And not everyone, mind you - C++ conferences keep having sessions on how to advocate C++ to those devs.


I've seen some people using MicroPython combined with C in embedded devices. Or plain Python on embedded Linux doohickeys. Company I work for has a few embedded Linux devices running Python.


Robotics is pretty much exclusively in C++ as well.


I agree that C++ is bad, but I actually find C much worse for projects bigger than a few thousand lines. Reasons for this:

* lack of namespaces - all names with long prefix look the same to me

* just text based macros

* no generics

* error handling usually based on int constants and output function parameters - in big projects it's hard to use these consistently without any type checking

* no polymorphism

* ton of undefined behavior (almost the same as C++)


All your complaints about C are valid, except I'd say defined but stupid behavior buried somewhere in a gargantuan language spec is effectively the same as undefined behavior.

The difference is, C lets you control how much baggage you carry along and C++ doesn't. If I want a higher-level abstraction in C, I can usually implement it pretty well using inlines and macros, void and function pointers. Will it be pretty? Hell no. Will it have all of the syntactic sugar that the C++ version does? Nope. But it will work and be usable and most importantly the complexity/risk that it carries along will be exactly what I asked for (because I know how to manage it). Using a feature in C++ means carrying along all of the baggage from its dependencies and interactions with every other feature.

If programming languages were cars, it's like the dealer saying I can't add one option to the base model. No, I have to buy the top-of-the-line model with dozens of features I don't actually care about and will never use, costing far more and oh by the way that model has door handles that will cut you. That's about when I go to the Python dealership down the street, or just stick with good old C.


> Using a feature in C++ means carrying along all of the baggage from its dependencies and interactions with every other feature.

Only if you use every other feature. Don't do that. Use the features you need (and understand well), not every feature. It actually becomes much like you say C is, except that you don't have to write the features.


> Only if you use every other feature.

Not true at all. A lot of the features have either an interface or an implementation driven by the possibility of combination with some other feature. The cognitive and/or performance costs remain even if that other feature isn't used. For example, it's easy to get mired in writing extra constructors/destructors and virtual hooha just because someone using your class might also use some feature besides the one you used yourself. I've seen that happen on many projects. The only way to avoid it seems to be to abandon most of what makes C++ different than C, at which point it would usually make more sense to start with C and add what you need.


> For example, it's easy to get mired in writing extra constructors/destructors and virtual hooha just because someone using your class might also use some feature besides the one you used yourself.

I don't think I've ever seen that. (In a library, sure. In application code, no.) If you know it's going to be needed (or you know it's very likely), sure. If not, every place I've worked in added the extra constructors when needed, and only when needed.

Extra destructors? Other than empty destructors, I don't think it's possible to create an "extra" destructor, because each class can only have one. (And, in the case of virtual destructors, adding them is good practice. But that's not "combination of features", it's part of the deal you sign up for when you start using polymorphism. (Though I guess you could describe it as the combination of destructors and polymorphism, which is true, but it's simple enough I have a hard time regarding it as out-of-control complexity explosion.))


> In a library, sure. In application code, no.

It has been a long time since I worked on an application so trivial that parts of it weren't broken out into separate libraries. Most often, those libraries end up being maintained by people other than their users, and the users always end up using the library in unexpected ways or contexts so these protective measures always become necessary.

Perhaps more to the point, I don't think there should even be multiple constructors each called in different situations that it takes several pages to describe. This complexity mostly exists because the people who defined C++ don't seem to understand the significance of value vs. reference semantics, as the OP also noted. Move semantics and rvalue references represent a tacit admission that the previous semantics were broken, but they just introduce even more non-intuitive syntax and another few pages to describe what happens in which situations. That's exactly the kind of spurious complexity that makes compiler writers want to go on killing sprees and folks like me want to avoid the whole morass.


Never really had a problem with the lack of namespaces actually (and not due to insanely prefixed names); if everything is properly separated there just don't seem to be enough chances for name clashes: headers should have only things which really are publicly needed, sources should only include headers they need, and one of the design targets should be high cohesion/low coupling. Though it probably depends on the project.


> no polymorphism

C has plenty of polymorphism. The compiler just doesn't do it for you. In fact, C++ started out as C-with-classes since it was a pain to keep recreating the OOP-in-C boilerplate over and over. Besides, there are more kinds of polymorphism than virtual methods. You can be polymorphic with an array of function pointers.


>"C++, on the other hand, seems to have been designed by compiler writers for their own enjoyment and/or job security."

It's not what a compiler writer would want by any stretch. Having helped write a C++ compiler I can attest to that. I will agree that C is a nice language. It does exactly as you tell it.

The complexity, I would say, is what you get when you "design by committee".

Web standards have a similar problem: they keep growing and getting more complex. Try to write a browser from the ground up these days.


Strongly agree on the inordinate number of features point. Which other mainstream OO language supports dynamic & static dispatch, single & multiple inheritance, interface inheritance via abstract classes, generic types and functions via templates, lambdas, type inference, and operator overloading? Roll that up with C compatibility, a 30 year history of evolution, and the complexity of commonly adopted libraries like STL & Boost, and there's simply no avoiding it. There's a lot of C++ out there, and I'm confident that will pay the bills for years to come for this 50-something, 25-year C++ veteran!


I think that's spot on. I coded quite a lot of C++ in the early 2000s. Now I'm considering jumping on board again. C++11 and its successors are almost a different language.

More importantly, for my niche I don't see anything that can readily replace C++. Rust has very little support for scientific computing. Julia is great, and will replace my high level statistical inference code, but it's not designed to let me design low-level close-to-the-metal data structures. Scala has memory issues, which will be hopefully less problematic once value types are implemented in the JVM. OCaml and F# look interesting, I haven't evaluated these carefully.


If you are doing scientific computing, have you considered Fortran? The gfortran (GCC Fortran) compiler supports Fortran 2003, which has object-oriented features, and since Fortran 90 the language has had array operations and syntax similar to Matlab or Python with numpy.


I'm not a big fan of Fortran for my particular domain, which includes lots of strings (biological sequences). Here Fortran is not as quick. See for example the k-nucleotide benchmark:

https://benchmarksgame.alioth.debian.org/u64q/performance.ph...


I worked with a customer that was using .NET for DNA sequencing.

Not sure what tricks they pulled off regarding unsafe code and parallelism, but it was fully done in C#.


What are Scala's memory issues?


C++ is a language with a lot of features, and not all of them should be used in every code base. It's quite possible to write simple, portable, and relatively clean C++ code if you use only those features you need.

It's not the prettiest language by any stretch, but it's quite capable and fast and has excellent support across just about every platform.


I saw Stroustrup give a talk and he said exactly this: C++ is a toolbox of different paradigms that aren't meant to be combined. If you avoid frameworks that impose a particular paradigm, and/or shield the parts of the codebase that use different paradigms (for example, when using Qt you really should have your "engine" running alongside the Qt code rather than embedded in the UI; an extreme example, I know, but the same idea), you'll have a grab bag of different approaches for the specific problem at hand, at zero cost. When people talk of "sticking to a subset" they are unknowingly doing exactly that (especially since everyone's subset is slightly different).

There might be a "culture of complexity" in the community, but to remove the conflicting paradigms from C++ is to destroy what makes C++ useful. I don't believe C++ is complex in its DNA, but it is highly experimental, overwhelming to newcomers and experienced developers alike (since you have to truly understand any feature before using it), and easily misunderstood. It requires more strictness in design and implementation than other languages, and isn't my first choice for anything that doesn't require high performance. But since I'm in game development and audio synthesis, it's often my only choice, since nothing else hits that sweet spot of abstraction and performance.


> If you avoid frameworks that impose a particular paradigm and/or shield the parts of the codebase with different paradigms...you'll have a grab bag of different approaches for the specific problem you're having for zero cost.

Avoiding frameworks and libraries which use unwanted language features and paradigms is very hard. Once these libraries are integrated, it is nearly impossible to restrict a team from using said features elsewhere in the project. Every C++ developer has pet features and features they hate and will never use, but these sets are rarely compatible between developers.


just a contrary opinion. I read the c++ book in the very early 90s. someone told me it was the future of programming.

every single c++ shop I've worked at since has said, 'well, yes the language is a mess, but if you stick to a well-controlled subset, it's really pretty good'

and all of those shops, without exception, have dragged in every last weird and contradictory feature of what is a really enormous language. so I guess the 'sane subset' argument is ok in theory, but really not in practice.

i've actually seen some *really* clever mixin/metaprogramming with templates. it was a total disaster, and in a different environment it could be a really great approach. i could never understand it in complete detail, but if C is a .38 that you can use to blow your foot off, C++ is a 20ga shotgun with a pound of c4 strapped to your head.


> and all of those shops, without exception, have dragged in every last weird and contradictory feature of what is a really enormous language. so I guess the 'sane subset' argument is ok in theory, but really not in practice.

To be fair, this happens with every language I've been associated with, even C. Just look at those people who do metaprogramming with the C preprocessor. It's madness!

I used to do that too, as a young programmer. It took about 10 years to grind that out of me. One advantage of us older programmers is we show how clever we are by writing amazingly simple and understandable code. :-)


> To be fair, this happens with every language I've been associated with, even C. Just look at those people who do metaprogramming with the C preprocessor. It's madness!

To be fair, C++ metaprogramming with templates makes the C preprocessor look like an insignificant ant in front of a truck.


"dragged in every last weird and contradictory feature of what is a really enormous language"

This happens with every language that has a lot of features. You have to try them before you can form an opinion.


> This happens with every language that has a lot of features.

Seems like the key is to not have a lot of features, then.


But languages with not a lot of features trap you unless you never need that feature (or unless you're fine with implementing that feature yourself in what the language gives you).


Then we should be using FORTRAN 77 forever?


>C++ is a language with a lot of features, and not all of them should be used in every code base.

That's not up to the individual coder coming later to a codebase. Or wanting to use a library that enforces those features, etc.

And the design of the features can impact how other features are implemented, even if one doesn't use them.


"Sticking to the features you need" is pretty hard to implement. Either you end up with a stale code base that doesn't use new and useful features. Or you start using new features and only later realize the headaches they may cause. Neither situation is desirable.


Regarding D, it looks like that, but any big code base ends up leaning on metaprogramming (mixins) and templates.

They have a ton of warts regarding annotations, usually worked around by using templates, because in that case they are inferred.

The semantics of shared are still being worked on.

The way const/immutable works makes some devs just give up and remove them from their code.

I can equally tell some Objective-C issues.

Yes, in general they are better than C++, but not without their own warts.


> The way const/immutable works makes some devs just give up and remove them from their code.

That's true. The thing with D const/immutable is they are transitive, and the compiler means it. It's not a suggestion.

The advantage of enforced transitive const/immutable is, of course, that it's foundational if you want to do functional-style programming.


Not to do the c vs c++ thing, but I too prefer c; for over 20 years, actually, I have preferred c to c++, all the while learning and enjoying other languages like rust. c is still missing some crucial stuff in my opinion: there are times where I wish generics existed, and when you tie that to a trait system like in rust, it's really a pity not to have something like that in c. Couple that with the really great package managers for libraries in other languages, and a coherent way to use those libraries, and you find yourself longing for more when working in c; it's really missing the great and easy code sharing that so many other languages put front and center.


> there are times where I wish generics existed

They kinda do...


> C++, on the other hand, seems to have been designed by compiler writers for their own enjoyment and/or job security.

If that were the case, the Edison Design Group (https://en.wikipedia.org/wiki/Edison_Design_Group) wouldn't exist. It exists because compiler writers don't want to have to deal with parsing C++.

(Then there's Dinkumware, which serves the same purpose for library functions.)


How is Rust less complicated than C++? I don't use it, but from what I read it seems to be even more complicated, and getting even more so with the myriad of features they are adding each release.


> How is Rust less complicated than C++?

In pretty much all senses of the word?

> getting even more so with the myriad of features they are adding each release.

It's far from adding a "myriad of features" with each release, and most of those it adds are library stuff, see for 1.23: https://github.com/rust-lang/rust/blob/master/RELEASES.md#ve...

And with respect to non-library features currently in-flight, by and large they are "playing catch-up" to C/C++ for compatibility or to fulfil use cases for which the language is not currently convenient or sufficient e.g. generics-over-values, const functions, allocators.

Rust has an upfront feeling of complexity in lifetimes and the borrow checker, but here's the ugly truth: pointer lifetime issues don't exist any less in C++, the only difference is the compiler doesn't help you with them.


> pointer lifetime issues don't exist any less in C++, the only difference is the compiler doesn't help you with them.

There's also the case though where the Rust compiler isn't helping you, but is just wrong.

Most of them are going to be fixed by non-lexical lifetimes soon though.


> There's also the case though where the Rust compiler isn't helping you, but is just wrong.

Bugs aside, compilers are not wrong, they may be too limited for what you're trying to do. Which is a different situation.


Being too limited for the intended purpose is still being wrong on some level. Not a bug, but still.

Granted, sometimes you don't care: stuff like `true?1:"foo"` is of type int and has value 1, but the compiler will reject it because of a type mismatch in the conditional; you don't care, though, because the code smell is too big and obvious to ignore.

Sometimes, however, you do care, if only a little: the value restriction in ML, for instance, prevents some functions from being polymorphic because some side effect might break the type system. This forces you to eta-expand the function definition (no partial application; you need to write the arguments explicitly), which is a bit of a hassle in some cases.


> Being too limited for the intended purpose is still being wrong on some level. Not a bug, but still.

By that criterion, every single statically typed language is "wrong on some level"; you'll always find something you want to do/express which a specific language's compiler will not accept. That's not very helpful.

> Sometimes however you do care

I'm not saying people should not care, caring about lexical lifetimes being a pain in the ass is perfectly sensible (there's a reason why non-lexical lifetimes are being implemented after all). I'm saying there's a gulf between "the compiler does not allow X" and "the compiler is wrong", and lexical lifetimes are the former.


Well, most compiler bugs are cases where the compiler "is wrong", and "being too limited for what I'm trying to do" also sounds like a bug. So this reads to me as "aside from when they are wrong, compilers are not wrong" ;)


> Well, most compiler bugs are cases where the compiler "is wrong"

Most compiler bugs are situations where the compiler allows stuff it should not or generates incorrect code.

> "being too limited for what I'm trying to do" also sounds like a bug. So this reads to me as "aside from when they are wrong, compilers are not wrong" ;)

Most every compiler is "too limited" to do some things, that is a big reason why dynamically typed languages are still a thing. For instance I can't tell Java to just take any object with an attribute "foo" despite it being the only thing I want to use (and I don't even care about its type) — bypassing the compiler via reflection aside. Do you think that's a bug in the compiler?


Hm ... I wouldn't say that's a limitation of the compiler but of the language itself (no duck typing). Definitely not a bug, though.


Here's one example:

In C++, creating an instance of a class is fantastically complicated. The class must be initialized, and there is a zoo of different initialization forms: value, direct, aggregate, default, list, copy, etc. Which one foo {} invokes has been the subject of a spec bug. Some of these invoke a constructor, which is like a function, but isn't a function, multiplying the number of concepts involved further. Constructors suffer from bizarre syntactic limitations. They need a special syntax for handling exceptions. The form foo f(); famously declares a function instead of default initializing f. The [dcl.init] section of the spec is about 16 pages long and there are about another dozen about how constructors work in the special member functions section.

In Rust, there is exactly one way to create an instance of a struct: you provide a value for each of its fields.


> In Rust, there is exactly one way to create an instance of a struct: you provide a value for each of its fields.

As a user of both Rust and C++, I certainly wouldn't consider the lack of default constructors in Rust to be a good thing. Especially for std types like Vec it is really annoying to have to initialize everything explicitly.

> The [dcl.init] section of the spec is about 16 pages long and there are about another dozen about how constructors work in the special member functions section.

That is a bad argument if you're comparing to a language that doesn't have a spec. Maybe a complete, exhaustive Rust spec would be much longer than the C++ standard?


> As a user of both Rust and C++, I certainly wouldn't consider the lack of default constructors in Rust to be a good thing. Especially for std types like Vec it is really annoying to have to initialize everything explicitly.

That is fair but orthogonal to the issue at hand, namely:

> How is Rust less complicated than C++?


A good but difficult question: it's difficult to quantify complexity. Both languages aim to give their users as much control as possible, even if it entails complexity in the language design.

Perhaps one answer could be that much of Rust's complexity arises from the concepts of 'borrowing' and 'lifetimes' and how they are encoded in the language.

In C++, much of the complexity arises from the copy/reference semantics and the possibility of overriding standard behaviour. You need wider context to understand local code, so you need to understand how C++ works at quite a low level, and you might need to know more specifics of your C++ codebase than would be the case in Rust.


It seems to have many fewer language-level features. There is no lvalue/rvalue distinction; pointers are not elevated to a language-level feature (only references); and there is only a very limited exception mechanism (panic) that doesn't try to generalize to support general validation. Instead, general validation is done with Result which is a plain old datatype written in the language, though admittedly it relies on a macro for use. More generally, this approach is also used for things like I/O: a lot more is done with plain old library functions written in the language rather than added to the language standard. The macro mechanism is much less of a special case than the C++ preprocessor; there is no "const" and its associated language-level complexity (e.g. "mutable"); the syntax is smaller and more consistent (no comma operator, no sequence-point rules); and more things are unified from the start (e.g. traits as values) rather than patched in later with ad-hoc conversion rules.


FWIW

> Result which is a plain old datatype written in the language, though admittedly it relies on a macro for use

It does not. It relies on a special construct (!) or macro (try!) for convenience, but these are not necessary (a popular alternative is to use the various HoFs instead) and desugar to pretty trivial (if possibly repetitive) code:

    macro_rules! try {
        ($expr:expr) => (match $expr {
            $crate::result::Result::Ok(val) => val,
            $crate::result::Result::Err(err) => {
                return $crate::result::Result::Err($crate::convert::From::from(err))
            }
        })
    }


I don't think Result would be seen as an acceptable replacement for exceptions without the macro/special construct. But I should've been clearer.


(It's ? not !)


(You're right of course, not sure what I was thinking)


For one thing, Rust has had the benefit of learning from the evolutionary lessons that languages like C++ went through. Another point is that really arcane stuff that's applicable only to very few legacy systems need not be supported, and isn't, while C and C++ don't have the luxury of dropping such support.


Rust mostly avoided accidental complexity. The complexity it has is mainly from tackling a complex problem of providing memory safety and preventing data races at compile time.

In C++ a large chunk of complexity comes from legacy of being a C superset and having to preserve backwards compatibility with all of its old features and syntax quirks, and often surprising edge cases arising from interactions between different features.


Oh give me a break. Following this logic, why aren't you writing your code in English?

The real world is complex. Don't confuse hiding complexity with minimizing complexity.


>Following this logic, why aren't you writing your code in English?

Because slippery slope arguments never shone light on any situation.

>The real world is complex.

Which is neither here nor there. We're talking about programming languages, where one can be 10x as complex as another, but as long as both are Turing complete, you can't really do anything substantially more.

So, this is not about more power to handle "the real world", but about ergonomics. Which nobody ever said was C++'s strong point.

>Don't confuse hiding complexity with minimizing complexity.

Well, we should also not confuse adding complexity with adding expressiveness.


See Brooks's "No Silver Bullet". C++'s complexity is accidental.


I thought of Brooks when I saw the title; as I mentioned in my other comment, I see a kind of amusing counterpart to Conway's law here: a complex programming language will produce a complex codebase.



