C++: Is It Really a Cruel Joke? (2003) (duke.edu)
88 points by rhapsodic on July 5, 2018 | 148 comments


Dr. Stroustrup took great pains to ensure the language was useful and did not break backward compatibility. He also always maintained that if you don't use a feature, you shouldn't pay for it (in terms of performance). Eventually it became even more popular, and there is now a whole organisation behind the international standard for the language. The book "The Design and Evolution of C++" helps one understand why some things are the way they are, and why some of them exist in the language in the first place.

The language takes a lot of flak but I think it is quite good given the constraints it has and the modern version is an amazing leap forward in ease of use as well.


> The book "The Design and Evolution of C++" helps one understand why some things are the way they are and sometimes it helps understand why some things exist in the language in the first place.

I am not a C++ programmer, but I found that book extremely interesting and a pleasure to read. I think I have publicly said so before, but I wish there were books like this for more programming languages. There are a couple of fascinating HOPL talks on languages like Lisp and Lua, but due to brevity, they are not nearly as detailed and in-depth as this one.


I read that book around 1998 and was an instant convert. C++ has a steep learning curve, and does not stop you from getting yourself into a lot of trouble, but it does stand up to Bjarne’s original promise about performance.

IMHO the STL is one of mankind’s greatest accomplishments.


The STL doesn't always live up to the standard of "zero overhead abstractions." The list is not high performance, many implementations of hash tables never shrink (and used to have O(n) erase!), and deque is a bad joke (no chunk size control, plus laughable fixed chunk size on some common platforms).

Many high performance projects use vector but few other containers.

If the STL had a lesson to teach us it should have been "iterators everywhere, including for your own algorithms and container types." Instead most people learned "C++ has all the containers you need built in, throw away your performance tricks and let it call malloc a million times."
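
To make the first lesson concrete, here is a minimal sketch (the names are made up, not from any real codebase): a tiny fixed-capacity container that exposes plain pointers as iterators, so both the standard algorithms and hand-written ones work on it without any heap allocation.

    #include <algorithm>
    #include <cstddef>
    #include <iostream>

    template <typename T, std::size_t N>
    struct small_buffer {
        T data[N];
        std::size_t used = 0;

        // Iterator interface: raw pointers are valid random-access iterators.
        T*       begin()       { return data; }
        T*       end()         { return data + used; }
        const T* begin() const { return data; }
        const T* end()   const { return data + used; }

        void push_back(const T& v) { data[used++] = v; }  // no bounds check, sketch only
    };

    // A hand-written algorithm in the same style: it works on any iterator pair.
    template <typename It, typename T>
    std::size_t count_equal(It first, It last, const T& value) {
        std::size_t n = 0;
        for (; first != last; ++first)
            if (*first == value) ++n;
        return n;
    }

    int main() {
        small_buffer<int, 8> buf;
        buf.push_back(1); buf.push_back(2); buf.push_back(2);
        std::sort(buf.begin(), buf.end());                           // STL algorithm, custom container
        std::cout << count_equal(buf.begin(), buf.end(), 2) << '\n'; // custom algorithm -> 2
    }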


"zero overhead abstractions" is a principle of the language, not the library. The STL containers are clearly designed to be on the "safer" side than the performant side. You are right that, perhaps, we've extracted the wrong lesson from the STL but your version ("iterators everywhere, including for your own algorithms and container types") is still definitely the right message.

As an example, STL unordered_maps have fairly strict requirements around iterator invalidation, and that results in indirection that does affect performance. It's pretty easy to make a "faster" unordered_map that tosses that requirement (as many do).


> Most high performance projects use vector and little else.

Right. And then it is actually a lot simpler to just use pointer + size pairs [1] instead of std::vector. Switching to explicit allocation was the best decision I've made. I no longer find myself longing for any C++ features at all. I haven't needed anything besides a little allocation wrapper [2] and maybe a string-to-hash map since.

[1] Or n pointers + 1 size for parallel arrays, which suggests it's a bad idea to glue pointer + size together in the first place.

[2] https://gist.github.com/jstimpfle/562b2c3e9fe537e378351bb9d5...
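
For what it's worth, the whole pointer + size approach is only a handful of lines. A rough sketch (illustrative types and names only, not the gist above):

    #include <cstddef>
    #include <cstdio>

    // A non-owning view: just a pointer and a count, nothing else.
    struct IntSlice {
        int*        ptr;
        std::size_t len;
    };

    long long sum(IntSlice s) {
        long long total = 0;
        for (std::size_t i = 0; i < s.len; i++)
            total += s.ptr[i];
        return total;
    }

    int main() {
        int storage[4] = {1, 2, 3, 4};
        IntSlice s = {storage, 4};      // works the same for heap, stack or static memory
        std::printf("%lld\n", sum(s));  // -> 10
    }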


> string-to-hash map

string-to-int hash map


Endeavors that really need high performance tend to use other standard library implementations. I think EA has their own, for example.


The motivation for creating EASTL was mostly to get rid of allocations. std::vector does not have an allocation problem. Other STL containers do.


C++ the language, maybe.

STL really is a cruel joke.

Is it OK to mutate a data structure while an iterator to it is live? It depends, which makes it much less abstract and generic than it could be (and than people believe it is).
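
A minimal sketch of the "it depends" part:

    #include <list>
    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3};
        auto vi = v.begin();
        v.push_back(4);   // may reallocate: 'vi' is now (potentially) dangling
        // *vi;           // undefined behaviour if a reallocation happened

        std::list<int> l = {1, 2, 3};
        auto li = l.begin();
        l.push_back(4);   // list insertion never invalidates iterators
        return *li;       // fine
    }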

Error messages are useless. Compilers can and do (recently) help with that, but the main issue is the convoluted STL design, in which every instantiation's type name takes a full 80x25 screen to spell out.

The STL's primary objects are individual iterated elements, which do abstract over pointers (as was Stepanov's intention), but at such a low level of abstraction as to be onerous.

Modern C++ is slightly better.


Could be true, but a modern language such as Rust has its own problems:

https://news.ycombinator.com/item?id=16442743


What you’ve just linked to is a solution to the issue mentioned above. Rust makes a lot of guarantees, but sometimes those guarantees run afoul of certain required operations in different contexts. For this reason the language has an escape valve, unsafe, and the unsafe blocks are blindingly obvious in the code.

To say it’s a problem is misleading at best.


STL tried hard to abstract the pointer-manipulation semantics of low-level C and apply them everywhere, whether they fit or not. This pointer heritage inevitably led to misfeatures like invalidation of iterators under container mutation, or the need to have access to both the iterator and its container to implement things like vector::erase().

Other languages and runtimes use the notion of an iterator or cursor that alone is enough to perform all loop-like operations on the container, including efficient erase(), and that can provide protection against container mutation cheaply enough for production use, or even at zero cost with a good optimizing compiler.
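
For contrast, the C++ side of this in a minimal sketch: erasing while iterating needs both the container and the iterator, and you must re-fetch the iterator from erase() to avoid using an invalidated one.

    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3, 4, 5};
        for (auto it = v.begin(); it != v.end(); /* no ++ here */) {
            if (*it % 2 == 0)
                it = v.erase(it);  // needs the container; returns the next valid iterator
            else
                ++it;
        }
        // v is now {1, 3, 5}
    }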


I have recently gotten up to speed with C++17 features and I want to explore more C++. So far I have used Java and Python to build webapps. Are there libraries/frameworks that can replicate MVC in C++ as well? I think I can extend functionality from there.


Not quite MVC, but RxCpp exists and can help with designing MVP-style code (also known as event-driven programming).


I always felt that if you wanted the same functionality as C++ in terms of performance, flexibility and power of abstraction, you'd end up with something as complex as C++.

Modern C++ shows there was/is room for improvement, but it does not fundamentally simplify the language, IMO.


Only if it needs to have backward compatibility. Rust doesn't, so it can be vastly simpler.


I love Rust, especially for emudev and embedded development (fields where I would traditionally use C or C++). However, it still performs worse than C++ in most cases [1].

[1] https://benchmarksgame-team.pages.debian.net/benchmarksgame/...


A number of those C++ benchmarks are "cheating" by using SIMD intrinsics, which were only stabilized in Rust about two weeks ago.


1) Specifically which of those C++ programs? Innuendo is not OK.

2) Even with cheating in scare quotes, name-calling is not OK.


It's not innuendo when you consider the inherent problems with benchmarks. Once you have an algorithm, it's so hard to define what an objective benchmark is that you should assume the implementation is cheating, even if you wrote it.

I say this from personal experience; in one case I was doing timing studies to solve performance problems, and wound up fooling myself by measuring the wrong thing!

In this case, is it fair to use SIMD intrinsics? It depends on what you're trying to measure. I think that's why "cheating" is in scare quotes, because what would be cheating in one context might be useful information in another.

For instance, if C++ is providing SIMD intrinsics, it's going to beat other languages, and if I just want current performance statistics, that's the question I want to answer.

If the question is, "what's the overall quality of the code delivered by the compiler / optimizer" then using specific tricks doesn't give me a good answer.


> …when you consider the inherent problems with benchmarks…

dralley's comment does not do that.

There's nothing difficult here: simply say that those X of N leading C++ programs use SIMD intrinsics, when the corresponding Rust programs do not.

dralley might even say that SIMD intrinsics have been available in Rust nightly for years.

dralley might even say that someone has contributed a Rust program that does use SIMD intrinsics, but that program was slower than other Rust programs:

https://benchmarksgame-team.pages.debian.net/benchmarksgame/...

Perhaps if C++ [or Rust] is providing SIMD intrinsics that is not in-itself a magical silver bullet.


How is Rust simpler than C++?


Rust is a much smaller language and stdlib than C++ (though still quite large). Rust makes an effort to make many things explicit that are implicit in C++, such as numeric type conversions. Rust's traits are conceptually simpler than classes, especially if you want multiple inheritance. Rust's templates/generics are much closer to the core language than C++'s template language.

And then there are all of the safety features of Rust; while there is a learning curve, it is much easier to learn the concepts of lifetimes and ownership when the compiler is helping to enforce proper usage.


Are you talking about grokking the language and its more exotic concepts, or writing code in it?

I have almost 30 years of C++ experience and about 6 months of Rust. My gut feeling is that the languages are equally difficult to understand but that the experience of writing code in them is very different.

I'm sure Rust will have its own set of surprises as I keep using it.

But I can't believe they will be nearly as bad as those that C++ comes with. ;)


If I had any real complaint about the execution of C++ it would probably just be the backwards compatibility with C. Don't get me wrong: I love C, and also I don't think it would've been possible for C++ to have gained so much relevance so fast had it not started as C with Classes. However, it does feel like a lot of kludges in C++ come from its legacy. Something that often confuses beginners is the sheer number of ways to do a thing, and some of them are not recommended to be used at all. With C diverging in incompatible ways from C++, the backwards compatibility has made less sense than ever.
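
Initialization alone illustrates the point; a minimal sketch of the different spellings a beginner will run into for the "same" thing:

    #include <string>

    int main() {
        // Four ways to initialize a variable, with subtly different rules:
        int a = 5;    // copy initialization (C heritage)
        int b(5);     // direct initialization
        int c{5};     // list initialization (C++11), rejects narrowing conversions
        int d = {5};  // copy-list initialization
        (void)a; (void)b; (void)c; (void)d;

        // Strings: the C way and the C++ way coexist, and beginners see both.
        const char* s1 = "hello";  // C-style string
        std::string s2 = "hello";  // C++ string
        (void)s1; (void)s2;
    }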


If people wanted an object-oriented compiled language that didn't interlink with and offer a smooth transition from C, there were already alternatives (e.g. Modula).


The direction C++ went was pretty different from even what would've been considered 'object oriented' at the time. I'd say half of what made C++ special was how much stuff could happen purely at compile time.


That's why the sheer insanity of Objective C++ intrigues me.

I always speculated that Jobs went up to the engineers, "hey, we use a lot of Objective C in OS X, right?"

"Yes, sir, Mr. Jobs."

"Well, everyone else, namely Adobe, is using C++ and we need them writing apps for the Mac. We've gotta have those apps. So we're going to need to support that."

"Um, yes sir, Mr. Jobs, we'll get right on that."

He leaves, and they look around nervously. "He's pulling our leg, right?"


> The book "The Design and Evolution of C++" helps one understand why some things are the way they are

As I've said quite recently elsewhere, C++ is an archaeological dig of a language. There are something like 4 major strata. If you would learn and use C++, it behooves you to pick a particular style, then stick to that. (RAII and smart pointers are very useful!)
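
(For anyone unfamiliar, the RAII/smart-pointer style looks roughly like the following sketch; the File wrapper and file name are made up for illustration.)

    #include <cstdio>
    #include <memory>

    struct File {
        std::FILE* f;
        explicit File(const char* path) : f(std::fopen(path, "r")) {}
        ~File() { if (f) std::fclose(f); }  // cleanup runs on every exit path
        File(const File&) = delete;
        File& operator=(const File&) = delete;
    };

    int main() {
        File log("data.txt");                       // RAII: resource lifetime == scope
        auto buf = std::make_unique<char[]>(4096);  // smart pointer: freed automatically
        // ... use log.f and buf.get() ...
    }  // no explicit fclose or delete[] needed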

There's something called the Taligent coding standards, which were once popular, then later castigated as turning C++ into "a poor man's Smalltalk." Small teams can write some dandy code using that style. Everything can fall apart at scale, however.

(EDIT: Here's an example from elsewhere in these comments: https://news.ycombinator.com/item?id=17463569 )


I am not a C++ fan. But D&E of C++ is one of the best books I ever read. It explains a lot about the real-world decisions they had to make. It was illuminating.


> There was this Oregon company - Mentor Graphics, I think they were called - really caught a cold trying to rewrite everything in C++ in about '90 or '91. I felt sorry for them really, but I thought people would learn from their mistakes.

I've talked to some of the people at Mentor Graphics who were there during that period. The company basically went from #1 in the industry to #3 or so during the course of the C++ refactor (and the EDA industry isn't exactly big). Bjarne Stroustrup showed up at the company now and then because the company was such a major early adopter. Inheritance chains were 5 or 10 classes deep. A full build took a week. The company hosted barbecues on weekends and invited the employees' families so they could see each other.

I only worked there somewhat later, so I just heard the stories from people who were there at the time, and only after I had been there a while. Take some old-timers out to lunch now and then, you'll learn a lot. I ended up leaving; I was more than a bit frustrated by the organizational culture, and the build system my team used was by far the worst I have ever seen in my entire life.

But Oregon is an awesome place to live, the salary was good, and the hours were normal.


> Inheritance chains were 5 or 10 classes deep.

30 years later, I still run into the same problem regularly. I'm not sure why, but this seems to be an anti-pattern that everyone needs to learn about the hard way.


I blame java


Don't blame Java. Blame programming instruction.

If it's anything like when I was in school, as soon as the curriculum trots out its first object oriented language, you get a lecture about how is-a relationships are the greatest invention since the compiler, and deep inheritance hierarchies are both the most practical and the most morally righteous way to organize your abstractions.

(Meanwhile, ironically enough, I'm not sure I've ever heard a CS instructor even mention the Liskov Substitution Principle.)


I blame inheritance.


JWZ told a similar story about Netscape between versions 3 and 4 being rewritten in C++. The full story is in the book Coders at Work, but part of it appears here:

https://gigamonkeys.wordpress.com/2009/09/28/a-tale-of-two-r...

Of course this was mid-90s when both the language and especially the compilers were quite different from now (and the compilers very buggy indeed).


> the hours were normal.

What were 'normal' hours for a programmer in the 1990s? Did you guys work 9-5?


Isn't 9-5 still normal now?


9 AM to 5 AM? ;-)


I feel compelled to note that this page is from 2003-05-13, so I have no clue how many of its gripes are still valid in 'modern' C++ :)


The author doesn't really complain about much specifically, just complains generally about the language. I guess it's hard to say whether or not the situation is better in the latest versions of C++ (though as someone who uses C++ an awful lot, I would assert that it has gotten much, much better).

That said, I think the complaint maybe comes down to coding style and architecture of the thing they're coding. They seem to make a joke we'd maybe more closely associate with Java than C++ these days. Also Microsoft's C++ style is awful. So if that's the only experience you have with C++ I would be hard-pressed to blame you for hating it.


Ya this is a whiny garbage post that looks like something I would have written when I was 18 had I been forced to use C++ for a school project (and probably could have been, given its year of publication). I'm not sure it adds anything to human discourse. It certainly isn't applicable to modern C++, or even the modern zeitgeist of pre-modern C++.


You can still do all those things in modern C++, in addition to all the stuff that gets added every few years. The only thing that has improved dramatically is the tooling: compilers, static/dynamic analyzers.

> And, as I said before, every C++ programmer feels bound by some mystic promise to use every damn element of the language on every project.


And smart pointers had been a thing for a while at that point (though not part of the standard library)

https://www.boost.org/doc/libs/1_61_0/libs/smart_ptr/smart_p...


I wonder how things have progressed from a security/stability standpoint - my OS definitely crashes a lot less, I'm guessing because there is less C in it now...


Which OS? Linux and Darwin are still entirely C in the kernel, I believe. Windows is C++.



There was a post in that thread that I think confirms my suspicions - that most 'new' code outside the kernel is being written in C++/98 or 14.


The kernel is compiled in C++ mode and has a few classes AFAIK. It has been for a very long time - the Win 3.1 kernel already had some C++ in it.

The macOS kernel driver interface, IOKit, is C++ too.


I use Windows 10. The kernel is still Win32 C calls, but I bet a lot of the services on top aren't.


Just because it's a C API doesn't mean it's not C++ underneath. Heck, Microsoft's C runtime is written in C++ nowadays.


"There are only two kinds of languages: the ones people complain about and the ones nobody uses."

-- Bjarne Stroustrup [1]

[1] http://www.stroustrup.com/bs_faq.html#really-say-that


While the OP article is indeed unhelpful and unconstructive, it's a pet peeve of mine to see people reference this quote (in the context of any language, not just C++) as a way of deflecting criticism. Not all languages that are used get complaints in equal proportion, and finding the constructive complaints is how we work towards making all languages gradually better.


> Not all languages that are used get complaints in equal proportion

It's because they are not used in equal proportion.


Presumably "equal proportion to their use" is implied.


There are plenty of languages in use that people genuinely enjoy using [1]. A lot of complexity in languages tends to be completely incidental. It's often not inherent in the problem the language solves, and it's just a design decision somebody made based on their aesthetic and experience at the time.

A lot of people seem to take pride in memorizing these quirks, but they're just that. There's nothing fundamentally interesting about them, and they're just mental clutter at the end of the day.

We should strive to have well designed languages that are optimized for developer experience. Accepting poor design decisions just keeps perpetuating the problem.

[1] https://insights.stackoverflow.com/survey/2018/#most-loved-d...


I can think of only two C++ "quirks" that meet the definition of "just mental clutter; nothing fundamentally interesting about them."

  1. Nested templates' closing brackets conflicting with the `>>` operator, necessitating `> >`. This was fixed in C++11.
  2. The syntax for declaring an automatic variable conflicting with the syntax for a C function declaration: `Thing mything();`.
Those two certainly seem to be "unforced errors" where the language is simply stepping on its own feet for no good reason, and one of them hasn't even been relevant for over five years.
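
The second one, for anyone who hasn't hit it, looks roughly like this (Thing is a made-up type):

    #include <vector>

    struct Thing {
        std::vector<int> v;
        int size() const { return (int)v.size(); }
    };

    int main() {
        Thing a();    // declares a function taking nothing and returning Thing!
        // a.size();  // error: 'a' is not an object
        Thing b{};    // C++11 brace initialization sidesteps the ambiguity
        return b.size();
    }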

In all other cases in my experience, investigating the rationale behind a particular quirk has led to a fairly interesting reason; a difference between the heap and the stack, say, or the language giving you the option to not do some work that may be expensive and unnecessary. For example, beginners are often surprised and annoyed that `remove_if()` doesn't actually remove anything and they need to call `erase()`. But most STL algorithms work on a pair of start and end iterators, and you can simply work with the new "past-the-end" iterator returned from `remove_if()`, allowing you to combine or omit the calls to `erase()`. This is certainly quirky, but it's not "mental clutter": there actually is a fairly interesting reason for the API being designed this way, rooted in the zero-overhead principle.
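
Concretely, the idiom looks something like this sketch:

    #include <algorithm>
    #include <vector>

    int main() {
        std::vector<int> v = {1, 2, 3, 4, 5, 6};

        // remove_if only shuffles the kept elements to the front and returns the
        // new logical end; no elements are destroyed and no memory is freed.
        auto new_end = std::remove_if(v.begin(), v.end(),
                                      [](int x) { return x % 2 == 0; });

        // The caller decides if and when to actually shrink the container.
        v.erase(new_end, v.end());  // v is now {1, 3, 5}
    }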

In my experience that has been the rule, not the exception - taking the time to understand the "why" behind a given quirk usually results in being forced to admit to yourself, "yes, I see; that is the only way it could have been designed as a zero overhead abstraction. The 'simpler' alternative I had in my head would require some overhead to implement." I think that's the reason why so many people on this thread who have read "The Design and Evolution of C++" change their mind and come away with praise for the language - because it lays bare the logic behind many of those design decisions.

I would be interested to hear which quirks in C++ you view as mental clutter and/or design mistakes. As far as I can tell, most of C++'s usability problems come from it being too carefully designed and too backwards compatible. And after witnessing disasters such as Perl 6, I'm not sure "backwards compatible" is really a "mistake" per se.


Honestly, I haven't used C++ in ages, and I believe you that there are great rationalizations for a lot of its behaviors. That's beside the point in my mind though. The end result is a very large and complex language that's very error prone.

C++ goes completely against the principle of least astonishment. There is a huge amount of mental overhead to reading and writing code in it. All that distracts you from the problem you're actually solving and directly translates into long development times, defects, and maintainability nightmares.

I don't think the complexity of the language ultimately justifies the goals it's trying to accomplish.


> after witnessing disasters such as Perl 6

OOC, how long ago did you witness this "disaster"? Perl 6 is doing very well, thank you.


This has got to be one of my favorite quotes about programming languages. ;-)


From the "interview" of "Stroustrup":

> ... You know, when we had our first C++ compiler, at AT&T, I compiled 'Hello World', and couldn't believe the size of the executable. 2.1MB

> Interviewer: What? Well, compilers have come a long way, since then.

> Stroustrup: They have? Try it on the latest version of g++ - you won't get much change out of half a megabyte.

So, for grins, I did, with the gcc7 port from macports, which is GCC 7.3.0.

-rwxr-xr-x 1 ssta staff 8968 Jul 5 10:40 hello

The C version, using printf instead of (gasp) std::cout, clocked in at 8432.
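
(The C++ version was presumably the canonical few lines; a minimal sketch of what that test looks like:)

    #include <iostream>

    int main() {
        std::cout << "Hello World\n";
    }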


    #include <stdio.h>

    int main(void) {
        printf("Hello World");
        return 0;
    }
>5649591 -rwxr-xr-x 1 user user 8288 Jul 5 21:49 a.out

What am I doing wrong?


Did you use -O3 or some other optimizations?


No, just `gcc a.c`.


> It is sufficiently fascist that it more or less "forces" students to program with a certain discipline. Never mind that no real coder EVER writes programs with that particular discipline once they get out of diapers...

I wonder what would be the author's opinion of Rust


>> Never mind that no real coder EVER writes programs with that particular discipline once they get out of diapers...

This line appears to be a snide remark at class-based OOP, for which a regular criticism is that real-world projects rarely slot neatly into the sort of "Cow is-a Mammal is-a Animal" taxonomical hierarchy that is used to teach class-based OOP in school. Rust doesn't have classes or taxonomical hierarchies, and encourages struct-first POD design (similar to C), augmented by traits which provide shallow has-a relationships (composition) rather than deep is-a relationships (inheritance).


2001ish? Besides, it's a really tired debate/joke. Yes, it's hard to learn and has a bunch of misfeatures. But, like JavaScript, it's widely adopted and in use on so many real projects that you're not going to get to rewrite them.


If you only want to read the joke interview, here's a website with a better layout: http://harmful.cat-v.org/software/c++/I_did_it_for_you_all


No joke! No joke! You're the joke!

http://www.stroustrup.com/whitespace98.pdf

Generalizing Overloading for C++2000. Bjarne Stroustrup. AT&T Labs, Florham Park, NJ, USA.


I know C++ isn't really dying off, but what is the best platform-agnostic compiled object-oriented language these days?

I liked Borland Pascal, and I know Delphi is kind of ticking along, but I'd rather invest in a language that is growing.

Swift looks nice but still seems too Apple focused.

Don't want to start a war, just open to some tips on the ecosystem..


People often think of C++ as an object-oriented language. That's understandable, as OOP was the key feature back in the earliest days. OOP got hyped and overused. Now OOP is (almost) considered harmful in the C++ community.

If you look at modern C++ libraries (like boost), you will see a lot of templates and free functions, and not a lot of inheritance or dynamic polymorphism.


> If you look at modern C++ libraries (like boost), you will see a lot of templates and free functions,

I mean, people were already calling for more templates and free functions in 1997. It's not modern by any stretch of the imagination; it's just normal C++.


FreePascal could work, if it someday gets enough support.

Rust, maybe. Swift at least has a valid alternative for moving to other platforms:

http://elementscompiler.com/

I wish Apple would get smarter and bring Swift to Windows, with better Linux support.


If you liked Borland Pascal/Delphi then check out Free Pascal/Lazarus. If nothing else, it is growing (a bit too much if you ask me, but it isn't the monster that C++ is - yet). The main star of the show here is Lazarus which is basically a cross platform Delphi with support for the native widgets for each platform (on Linux you can choose between Gtk+ and Qt).

Another alternative is D, which is basically C++ minus a few warts, with the option of a somewhat faster compiler (all modern C++ compilers are very slow). But it wasn't made yesterday, so it has accumulated its own cruft. The most common example is that most of the new "flags" for declarations take the form "@stuff" instead of just "stuff", even though some older ones are just "stuff", so things look a bit messy. In a similar vein, the new supposedly best practice, especially for libraries, is for functions to be pure and nogc, but this isn't the default, so you have to put those declarations everywhere. Still, it is the most obvious choice for someone who wants a better language than C++ without changing too much.

There is also Rust, but I don't know much about it.


> I know C++ isn't really dying off, but what is the best platform-agnostic compiled object-oriented language these days?

C++ with Qt.


If you really want to keep getting things done today, especially on Windows, and not just learn something new, then stick with Delphi/Lazarus for now.

Every year I keep looking to see if a language will become available that can replace Delphi, and there have been some close candidates, but nothing quite there yet. The close candidates right now are AOT-compiled C# and Go. If you like the OOP in Object Pascal, then the type handling in Go will probably seem wonky to you, and as far as I know, AOT C# isn't quite ready, but I may be wrong on that front (it's hard to find information on the AOT progress with C#) and someone else can chime in with more information. I cannot overstate what a game-changer AOT C# will be on multiple platforms, especially if they can keep the binary size down.


I was involved with a large OCaml project with a GUI which was compiled on Windows and Linux. We didn't try macOS but it would probably have been possible to port it there too. For the GUI we used Gtk (with a Windows-flavoured theme on Windows). It was written with emacs, built using a bunch of Makefiles, and shipped as a native binary on both platforms, with an NSIS-based installer on Windows.


A warning for anyone thinking of doing their cross-platform GUI this way: GTK only supports accessibility tools (such as screen readers for blind users) on Unix (via AT-SPI), not Windows or macOS. I'd suggest using wxWidgets or Qt instead. Of course, both of those use C++ as their native language, so creating bindings to other languages isn't as easy as for something C-based like GTK.


You didn't exclude C++, so I'll vote for that. Like you said - it isn't dying off and there are lots of C++ programmers out there so hiring is easier than with less popular languages.


I'm pretty fond of D and Nim.


You mentioned “compiled” without “…to native code”.

If that’s what you meant, C# fits quite well. The language is safe, very high-level but has easy ways to use C interop or pointer arithmetic, performance is adequate for many practical applications, asynchronous IO and multithreading are IMO best in the class, tons of libraries, good documentation, many users.

The main downside is limited options for cross-platform GUI. On Windows it's very good, on mobile platforms OK; for the rest (Linux, esp. embedded, and OSX) there are no good options that I know of.

Another downside is runtime size; the current version is around 25-30 MB. No installation is required, but for some applications that's still too much.


Native programmer for the last 16 years. D is the one I'm building with because it keeps the "everything is possible" ethos from C++ and has no big agenda.


If a GC or binary size is not a dealbreaker, Kotlin is hard to beat. Startup time is fast when compiled to native.


D has potential


Yes, it has had potential for 11 years and will probably continue to.


Rust definitely is a good choice, if you can give up inheritance.


Wouldn’t classify Rust as OO though.


Why? It specifically chooses to drop one piece of OO, object inheritance, but it does have functional inheritance. It even has Deref which gives back a form of object inheritance.

It supports polymorphic functions like other OO languages.

While I find functional patterns more useful in Rust, that doesn’t prevent OO design where it’s desirable.


Rust has vtables and inheritance, that's fairly OO.


That (fake) interview at the end is hilarious.



Having spent the past decade working with Python, I've always considered C to be a mystical black box for hardcore systems guys. I recently needed to write some high-performance networking code, so I decided to give it a shot. At first I found it painfully verbose and confusing, with the simplest operations requiring several lines of code. After a couple of days I found it rather refreshing: it encourages you to look at memory from a more basic perspective, and efficiency just falls into place. Pointers are confusing and strict typing takes some getting used to, but they are awesome. It's great to be able to use the same memory with different types and structures; so much better than copying things around. I still love Python, but I will be using C again when performance matters.


I stumbled into Nim recently and this thing "just works". Python-like syntax, performance like C. This is what C should have been, I'd say.

https://nim-lang.org/


I would encourage you to use Cython instead of Nim if you're looking for Python-like syntax combined with C performance.


This is just my situation, but I gave up on Cython when I started moving forward with it (distributed graph algorithms). I found that I had more things to learn. With Nim, it took me half a day (after seeing the language for the first time) to get a basic numerical process going, and another few days for distributed data communication involving messaging and postgres/mongo (Nim has a modern, JavaScript (ES6)-style promise/async/await concurrency model that is powerful and succinct, like the language itself).


I find that strange, but I concede you know your use case better.

I work on a very similar type of application that manages async workers who process large distributed NLP tasks. Writing it in Cython was extremely easy, because for the modules that have zero need for static typing, such as the part using async/await in Python 3, or when we supplement with gevent, I can just write those parts in plain Python and it’s quite a bit easier than Nim or Cython or whatever else, while still having great performance from those tools’ low-level implementation.

Then for the parts that do possibly benefit from static typing and compilation (unlike the async layer), I can have precise module-level control over what has a C-level implementation and if or how it interacts with anything in Python.

The inability to separate the two situations in Nim (as with many statically typed languages) just doesn’t work out well enough for my use cases.

In fact, I’d even go as far as to advocate that in today’s language landscape, if you want to write a new greenfield project in C or C++ for performance reasons, it’s unequivocally your best option to write the whole thing in Cython, and avoid what you might call “premature static typing optimization” by profiling and leaving the things with no bottleneck in Python.


Hi - thanks for outlining your case. My Cython know-how is too limited to qualify me to make any kind of comparison with Cython. It's just that my attempt with Nim went surprisingly smoothly for me. Coming from some of the older languages I use and love, the reliability of the newer Nim is pleasing and coding is fun. I ended up eliminating all Python code from my back-end. Because Nim is statically compiled, I can deploy pieces of it anywhere just like a compiled C program, without dependencies, and that means a lot in my particular situation.


> "the reliability of the newer Nim is pleasing and coding is fun."

Can you elaborate on the reliability part? I've spent a lot of time grokking Nim specifically to be able to make good judgments about whether there are use cases in which it would be a better choice than Cython, and from a reliability point of view I have not noticed anything that would distinguish Nim from any other language. I can agree that Nim's syntax is nicer than many other statically typed languages, though the language design has some warts with `result` and `discard`, etc. But I can't see any reason to believe it is 'more reliable.'

> "I ended up eliminating all Python code from my back-end."

While I can't know the reason for this in your exact case, generally this seems like a very suboptimal thing to do. Python has a much richer set of libraries, testing utilities, etc. It is a language with a huge community of users and developers, and much more likely to be a known language for someone new who joins the project. If a system was working well and someone proposed to refactor away a solid base language like Python, that would almost always be a crazy choice, regardless of any positive aspects of the targeted new language. It's similar to why you should rarely throw away old code that has meaningful tests. You can slowly refactor it little by little, but wholesale switching to something else is usually evidence of wrong engineering priorities, especially when the something else is a 'latest and greatest' kind of new language or tool, like Nim is.

> "Because Nim is statically compiled I can deploy pieces of it anywhere just like a compiled C program, without dependencies, and it means a lot in my particular situation."

This can also be done with Cython, using the options to embed an interpreter... and there are various other third party tools that allow you to create thick binaries for combined Python programs as executables, including runtimes and dependencies. To boot, you definitely should be managing the deployment of some binaries with proper dependency management practices. So really, if you're already using dependency management techniques for the binaries, the minor extra work to maintain Python environments and dependencies would almost always be pretty trivial, with a huge family of tools (pip, conda, pipenv, virtualenv, etc.) and endless tutorials on the community-developed and mature best practices for packaging Python programs.

I would be curious to know more details about a project where it was truly advantageous from a productivity and deliverability point of view to rewrite the backend to move from a stable and mature ecosystem like Python to a relatively younger and less mature system with Nim specifically to gain a benefit somehow related to ease of deploying pieces of the code to different locations. The details just don't sound like they could possibly be in favor of using Nim in a case like that.


> Can you elaborate on the reliability part?

What I meant by reliability is personal. The stuff I learned when reading and experimenting with Nim in one day sufficed to do practical things in my daily work. Any new information was found easily and I could keep developing (vs. some years ago when I was learning Haskell: after the first joy with the "cleanness" of the syntax, the systems programming part down the road became a bear, and I had to go through yet another learning curve to digest that). Maybe reliability is not the word, but Nim didn't let me down even though only a very limited time was invested in learning it for my purposes.

> Python has a much richer set of libraries, testing utilities ..

Yes indeed. If an off-the-shelf numpy package sufficed, I would have stuck with it. Recent numerical work is where I tried Nim first: graph theory, and matrices with observed non-standard sparseness (i.e. not a Toeplitz kind of "standard" sparseness), which you use to your advantage by coding it yourself. Even if I find exactly the package I need from the community, I have to read the sources to know how it is coded. Subtle implementation details cannot be fathomed from the verbal documentation, as they impact rate of convergence, memory consumption, etc., or perhaps hide a bug that your use case uncovers! So during prototyping and validation you use Python or whatever to help with the exploring, test cases, etc. Once I know exactly how I need to structure the algorithms, I code with Nim and I know exactly what's in it. Nim coding has been easy, and the performance unusually good.

> with a huge family of tools (pip, conda, pipenv, virtualenv, etc.)

From my point of view, I'd rather not carry that many things around to develop, deploy, and manage processes at many distributed sites. On my local machine, yes. The initial development install of Nim anywhere (takes 3 minutes, no admin) has all the tools for the build, unit tests, packaging, and integration tests, and you can run the compiled binary anywhere, on systems supporting just the key data/network dependencies that the application demands... traveling light.


I won't comment on Cython as I haven't personally used it much, but:

> though the language design has some warts with `result` and `discard`, etc.

Why do you consider these to be warts?


For `result`, the basic description from Nim’s example pages highlights why it’s a severe problem.

< https://nim-by-example.github.io/variables/result/ >

Even just needing to account for that mental gymnastics about declaring a new result variable is, I think, not forgivable.

The bigger issue though is that initialization of the return variable is implicit, which is inherently problematic. This is especially troublesome because type constructors in Nim are essentially always separate factory functions. So you have to remember to manually call a particular constructor or else `result` might be just an improperly initialized skeleton of your data type.

For example, I might have some type called MyType and a proc with return type of MyType. I explicitly don’t want the proc to initialize `result` to an empty MyType behind the scenes, for whatever implementation reasons about MyType (a common example is a type that ought to be initialized with the acquisition of a resource and should never exist in a partially initialized state in which the resource acquisition hasn’t been attempted yet, and could possibly fail later).

If I only want it to be initialized from a special constructor like mkMyType(), then in Nim, I have to code around this limitation by making it a void proc, and passing in an appropriately mutable reference.

In other words, to avoid possibly inappropriate return type initialization, I am forced to revert to poor C-style void functions all over that mutate placeholder inputs by convention, which undermines a lot of things Nim tries to do to improve clarity about pure vs impure procs.

I don’t have time to go into why discard is a bad design idea right now, but hope to come back and add more later.


> For `result`, the basic description from Nim’s example pages highlights why it’s a severe problem.

This is simply an explanation for newcomers. It's really not something that's a "severe problem", just something to be aware of.

> The bigger issue though is that initialization of the return variable is implicit, which is inherently problematic. This is especially troublesome because type constructors in Nim are essentially always separate factory functions. So you have to remember to manually call a particular constructor or else `result` might be just an improperly initialized skeleton of your data type.

This really isn't an issue when you can do this:

    import options

    type
      MyFile = Option[int]

    proc getFile(): MyFile =
      # Oh no, I didn't initialise it...
      discard

    echo(getFile()) # -> none[int]
You can also use `ref T` and achieve a similar effect: an explicit "empty" state. So there is no weird semi-empty state problem here.

I would really like to hear why you think `discard` is a bad design idea. I honestly cannot even imagine a reason as I consider this to be one of the best features of Nim.


This is not a reasonable answer regarding initialization, because you may not want to wrap everything in an Option type. Especially not for an obscure side effect reason like initialization, and then need to litter Option handling all over, which destroys a lot of information in your types. It would be like misusing Maybe in Haskell as if it was for exception handling. You could never write pure functions that get lifted to utilize Maybe. Instead you’d be forcing people to manually use Maybe everywhere, for all signatures. I had also already pointed out the Ref option in my original comment, as an example of exactly the type of anti-pattern that makes it a bad thing you constantly have to code around in Nim.

It’s not reasonable to suggest you have to code past this intrinsic limitation everywhere by muddying all your function signatures to take Option types and adding extra logic to pack or unpack values from Option types all over... to solve an initialization problem!


That's true and I wouldn't use the Option type for that either. There is a switch that warns about variables that are not initialized explicitly (including the 'result' variable) and an RFC to make this switch non-optional. https://github.com/nim-lang/Nim/issues/7917


Briefly re: discard — since void is not a proper type like, say, Unit in Scala, void functions (especially with type parameters) are very awkward in general. Add the additional confusion that you can treat a value-returning function as if it were a void function with discard, and it further destroys clarity about the meaning of types, encouraging developers to shoe-horn side-effectful computation into value-returning procs on the assumption that they will just be used with discard (so you end up with weird convention-based designs where you think you're supposed to use something for its return value, but actually the "intent" is for you to use discard).

Basically, discard & result make Nim a nice language if you are programming alone, and you know & intuitively understand the conventions being used or you can control manually wrapping stuff in Option for a bunch of type signatures or whatever and you can enforce it how you like it.

But when writing code for other people to interact with, the implicit return type initialization creates weird ways of coding around it that are not clear or common sense for other people, and then mixing void and the use of discard makes it super unclear when or why it’s useful to ignore the return type in some context, instead of it having been actually designed as a void function (and for this to have a proper type).

This comment on this Nim issue gives a good example of what I mean, < https://github.com/nim-lang/Nim/issues/7370#issuecomment-376... >.

But generally, I think it just speaks badly of discard-style thinking. Write void functions to communicate side-effectfulness. Don’t mix concerns about a side effect and an optional return value and assume people will get your meaning and know when to use discard. That is more like coding for the function author’s benefit instead of coding for readers, users or other contributors.


I'm not really following you. 'discard' is an enforced, explicit statement that you are throwing away information/the result of a computation. That's about as far from "only nice if you are programming alone" as it can get.


Huh? What does its status as an explicit language construct have to do with the situations where it's useful for design?

I'm saying that when you program alone, you know when to use discard on your otherwise value-returning function. Other people don't, and the use of a return type actually suggests the opposite: that you should intentionally invoke that proc for its return value.

> “That's against "nice if you are programming alone" as much as it can get.”

I don't understand this claim. Nothing about the formal definition of a language is for or against being "nice if you program alone"; rather, it is a question of what patterns of usage it encourages or facilitates.

It's like "C++ without exceptions": the formal implementation is just a factoid of the language, but the usage that arises around discard is a bad anti-pattern in terms of communicating intended usage and whether/when to rely on side effects.


I agree that 'discard' is usually a code smell, but how would implicitly ignoring the result be any better? It wouldn't be better at all, and that means Nim's discard feature is rather well designed as it improves the status quo.


Why is implicitly ignoring the result the option you’re comparing with? Instead, create language features that encourage you to separate side-effectful functions from value-returning ones.

Also many languages use a very standard convention of assigning underscore to parts of a result value to be ignored, and discard has no clear advantages over this in my mind.


Why is that?


If you liked it you should try a language with actual strict typing like Ada or Rust.


> C++ is more insidious -- because it IS a superset of C, it sucks you in, and (like Pascal) it has been embraced by computer science departments everywhere

I've heard of this, but does it happen?

I did my degree back in the '90s at Drexel, I don't recall there being any prescribed language... I did some systems stuff in C, some AI courses in LISP, a concurrency course in Java, some stuff I don't recall in Perl, some math courses used Maple, and there was a bit of shell and some familiarization with the Solaris boxes, etc.

I don't think I ever actually took a course on a specific language, you were just supposed to RTFM and figure it out.


I think there should be a separate course on memory management w.r.t. any language, so that people can get a better perspective on what they are doing. A few books explaining modern memory management principles would help as well.


I used to do a lot of C++ programming, but over time I found that using either C or a real high level language (Python, Ruby, etc) works better for most tasks.


Yes, C++ is the cruelest, most insane joke, on the same level as embedding malware inside a compiler and distributing it to developers. No, really: it was a nice try making an object-oriented language on top of C, but then why not build on top of the superior OO features of Smalltalk or Lisp?


Modern C++ is WAY better than C, IMO. I can never go back to C, having tasted C++14, and I work on embedded systems.


Seconded. I'd even go so far as saying that C++ is my favorite lang; it's C with the lavish furnishings of Java.

C is a space ship, Java is a plane, and C++ is an amphibious SSTO. It's messier in design and trickier to pilot, but you can do more with it!


Fortunately, there are more choices today. I use Rust for embedded systems.


Type safety, constexpr, template metaprogramming: I don't know why anybody does anything in C in embedded systems when C++11 is right there. Compile time programming is what it's all about.

Support has been slow, but even the non-GCC embedded toolchains have been improving substantially over the last few years.
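
A small sketch of the kind of compile-time programming meant here (the values are made up; plain C++11 constexpr, no vendor extensions):

    #include <cstdint>

    // C++11 constexpr: the compiler evaluates this at compile time.
    constexpr std::uint32_t popcount(std::uint32_t v) {
        return v == 0 ? 0 : (v & 1u) + popcount(v >> 1);
    }

    // Verified by the compiler; no runtime cost, nothing to test on the target.
    static_assert(popcount(0xFFu) == 8, "evaluated at compile time");

    // A register value computed from named quantities instead of a magic number,
    // still folded to a single constant in the binary.
    constexpr std::uint32_t baud_divisor(std::uint32_t clock_hz, std::uint32_t baud) {
        return (clock_hz + baud / 2) / baud;
    }
    static_assert(baud_divisor(16000000u, 115200u) == 139, "checked at compile time");

    int main() { return popcount(42u); }  // 42 = 0b101010 -> 3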


Why do we as developers feel a need to constantly trash the things we don't like? If you don't like something, don't use it. Talk about the things you do like instead.


While I agree that it is good to not always dwell on the negative, there can be a lot of value in pointing out the deficiencies of programming languages.

Some folks will just use a language without thinking about its downsides. If you use language X for a long period of time, you can become blind to areas where it is wasting your time.

Again, I do agree that the tone could often be much more civil, but I would hate for experienced programmers to stop pointing out things that bother them about their programming languages. Much of the time it is not just about personal taste -- it is about real problems with the language that have a real cost.


I wasn't saying there's no place for constructive criticism or critiques. Those are definitely useful and important.

But that's not what this article is, and I think we all know the tendency I'm talking about for people to spend more energy and feel more comfortable trashing things than talking about what they like.


Because talking trash is fun.

Maybe the discussion will identify some problems with a thing. There's an infamous piece "PHP is a fractal of bad design" that I found very insightful.

But, honestly, it's mostly because it's entertaining.


[dead]


Since it seems you won't post anything else, we've banned the account. We're happy to unban accounts if you email hn@ycombinator.com and we believe you'll start commenting substantively.

https://news.ycombinator.com/newsguidelines.html


The only joke here is that somehow this author thinks the success of your project is dependent on the language you pick.


It's definitely an important factor; at the least it indicates lots of other factors:

- is the language suitable for the domain?

- is library/community support mature?

- are there enough developers?

- who in the company will support it? It's obvious from HN threads that we are obsessed with language topics.

- it also reflects some part of the design philosophy, which could be good or bad for your project depending on the case.


"Dependent" is a strong word, but tool/job fit is still important.

I've personally never grown comfortable enough with C++ to get to a point where, were I to be the one calling the shots, it would ever be my first choice. But I also recognize that it's dominant in certain spaces for a reason.

At the same time, I have a lot of sympathy for people who prefer C over C++. There's a lot of cognitive overhead involved in understanding the semantics of an object-oriented language, especially a big complex one like C++ or Java. And complex languages do have a tendency to beget complex implementations, even when you're working on a project that could be small and simple.


> it's dominant in certain spaces for a reason.

That reason usually is: "no other compilers were available" or "no other choices at the time".


I've been in the business for 15 years, and I have never seen a successful product fail because it was written in language X or Y. Products fail for a lot of other, non-programming-language reasons, though.


I have never seen a successful product that failed for any reason, ever. ;)


WordPerfect


Exactly! If I want to write my AAA game engine in JavaScript and Haskell it still has the same chance of success as an engine in C++


Funny how many people believe that language choice influences success. I guess they're all wrong?

Sure, there are an uncountable number of other factors, but picking the wrong platform/language can be fatal.


Yes, but that goes both ways: choosing a popular language can be just as fatal as choosing an unpopular language.

If you're a "body shop", then choosing an unpopular language could be fatal, whereas if you're building a specific software product that needs to do certain things on certain platforms, then choosing the language becomes less of a popularity contest and more about how it can help you finish the product quickly with the highest level of quality and performance.


Anytime I see someone bashing Java/PHP/JavaScript on /r/programmerhumor (reddit) I challenge them to a coding contest. I use the language they just bashed, they use their ideal language. No takers yet.

I think, more often than not, language choice is like golf club choice. Different pros have their preferences, but a pro can play a good game with any set of clubs. A novice will blame the clubs for a poor game.


To be fair I don't understand why anyone would bother. I am commenting on a forum, I don't expect to be given homework.


And you can perform carpentry using a big rock instead of a hammer. That doesn't negate any downsides of choosing the rock.


I would say it's more like people are pooh-poohing certain brands of hammers as being inferior to TrendTech Hammers (R) which are "obviously" superior.


The winner will be the one who gets to pick the task for the programming challenge.


Definitely pick the right tool for the right job.

Using C (maybe C++?) may be great for microcontrollers, a BIOS, a stage 1 boot loader, an OS kernel and device drivers.

If you're writing application software, and probably even server software, you should be using something higher level and a bit more abstracted away from the hardware. Not necessarily by much. But definitely more than C / C++.


I think it does and it doesn't. It really depends on what your end goal is. Writing a web application? Not sure it really matters much if you use Python/Django, Ruby/Rails, or Java/Spring. Just use what you (or the developers) are most comfortable with. Trying to make a game engine for AAA games? Yeah, you're not going to get away with Ruby or Python.


> Writing a web application? Not sure it really matters much if you use Python/Django, Ruby/Rails, Java/Spring. Just use what you (or the developers) are most comfortable with.

It might matter if you used COBOL, though. Or even C++.


Joking aside, it does contribute a lot. On one hand, the ability to get programmers (for an acceptable price) depends on it; on the other, some languages are simply not suited for some environments.

Would you believe I once saw a backend service written in PHP4? And not a small one, either. It was just the only language the 2 original authors knew... Apparently they had a bit of a memory problem because substr (or similar) leaked a few bytes, which over the course of a few months amounted to quite a lot of memory.


"Almost perfect"[1] disagrees with you.

[1] http://www.wordplace.com/ap/index.shtml


Which part? I skimmed through it and it mostly seemed to be about group interplay, direction and competition. That is what kills products, not the language of choice.


The part where they stick to assembly while the competition doesn't.


Where are the triple-A games written in Brainfuck, then?



