Hacker News | garrisonhh's comments

I think the above commenter is just pointing out that the fundamental conflict between these two things significantly lessens the incentive for pharmaceutical companies to put money into researching new antibiotics.


I know basically nothing about this field. Would a naive approach do anything: a coordinated “crop rotation” type tactic where all hospitals switch to a new primary antibiotic every C years/months?


Empirically, many bacteria seem to be able to acquire resistance in ways that don't significantly impact their fitness, meaning they can pick up new resistances and keep old ones almost indefinitely, so the crop rotation idea would fail massively.

https://academic.oup.com/femsre/article/35/5/901/2680377


It has costs. Replication gets more expensive, slower, and more error prone. Also, bacteria exchange resistance DNA among themselves. That would be a cool vector for giving them a weakness.


Good to know, thanks for the explanation!


As a current researcher in the field I am perpetually annoyed by the overeagerness of AI research to make fantastical claims. Reading and extracting information from papers is a minefield, and we've learned to always at least A/B test the conclusions of any technique that is supposedly proven to be useful. Even foundational papers about basic concepts in LLMs, for example, can sometimes boil down to "this worked well on our cherrypicked tests"


People don't have the time, money, or energy. And they will continue to lose all three until there is no option but to figure out collective organisation


It definitely feels deliberate and yeah it is a feedback loop.


Preferential attachment is a natural phenomenon, but corporations have gotten really good at exploiting it. This is one reason I favor just breaking up businesses beyond a certain size. The ostensible consumer benefits of scale are outweighed by the second-order costs of market distortions. Efficiency is nice but not worth the combination of rent-seeking and adversarial zero-sum strategies.


I hold some hope that it’s just an unexpected outcome of a complex system. otoh, that would make it more difficult to remedy.


There is almost nothing written for ubuntu that won't work on arch or really most current distros. When I ran arch I did have to get familiar with assumptions people make due to ubuntu, but this would never be more than patching a config file or configuring some environment variable. And those are useful and transferable skills.


Hmm, thanks for sharing. That's the kind of stuff I don't want to do, though. I already have my own software to write and I use a wide array of tools; I can't afford to lose a morning on that kind of stuff. Plus I want to be able to tear down and rebuild without a ton of custom config.


Does type theory fit into whatever definition of semantics you're using? It is certainly reaching some incredible heights of usability with tools like Lean, and closer to the mainstream with functional languages like Haskell or compilers for Rust. Verification as a topic certainly expands beyond that; 'semantic solving' has been around for a long time and produces some fascinating things.


Yeah, when I was trying to think of more recent examples type theory came to mind, but I think "structured programming" has had more bang for the buck so far...


Yes but no. The actual values represented by the quantized bits don't use a representation akin to IEEE floating point, but they are able to act like floating-point values thanks to mathematical transformations during propagation. The floating-point values a quantized value corresponds to are chosen by some kind of precomputation that depends on the quantization method.
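
To make that concrete, here's a rough sketch of one common scheme (affine, i.e. scale-plus-zero-point, quantization); the names and layout are made up for illustration:

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Hypothetical affine (scale + zero-point) scheme. The stored int8 payload
    // is not a float format; the precomputed scale and zero_point are what map
    // each integer back onto a real value during propagation.
    struct QuantizedTensor {
        std::vector<int8_t> data;
        float scale;        // precomputed from the original tensor's value range
        int32_t zero_point; // the integer that corresponds to 0.0f
    };

    inline float dequantize(const QuantizedTensor& t, std::size_t i) {
        return t.scale * static_cast<float>(t.data[i] - t.zero_point);
    }

Blockwise and codebook-style methods pick that mapping differently, but the shape of the idea is the same: the quantized bits index into float values that were chosen ahead of time.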


Personally I prefer the leetcode UI, but I find that AoC challenges are usually grokkable problems, whereas leetcode seems to require a ton of very bespoke data-structure and algorithm knowledge that simply has no purpose for me besides doing more leetcode.


Yes, I think this is the major difference. LC are specifically geared to test DS&A knowledge: trees, graphs, DP, BFS, DFS, greedy, etc. AoC are more about iterating fast to solve it however.


I think I used all of these structures/algorithms for AoC. The days after day 16 often end up needing them.


This kind of post is why I do not understand C++ programmers whatsoever. It's basically a demonstration of the absurd amount of time C++ takes away from you by being the moldy pile of shit that it is. You want better compiler checks and less freedom? Great, me too. Can we please shut the fuck up and use a language that actually supports that?


It simply has too many upsides and is way too popular to be replaced with:

- a simpler language that does less

- a less popular language

- a more verbose / restrictive language

If you want people to switch to something better, make something better. It's not enough to simply call out the issues (which is valid, but not useful anymore), and doing so entirely ignores that replacement languages like D (small, less active ecosystem) and Rust (hugely verbose and equally bloated) do not actually serve the entire range of C++ users.

If all the other compilers and languages were THAT much better, they would be used.

For a lot of apps, Rust has been that replacement, but not for all domains C++ is used in.

Why do people still use telephones when calling via whatsapp, telegram, element or discord exists? Why do people still drive shitty motorbikes when electric cars exist? Why do people still write by hand when you can type instead?

Because the replacement is not a complete replacement and does not actually work better for everyone.


I like the idea of Rust, the tooling, the package manager (far better than what you can find in C++).

What you are saying is true: I've been writing high performance scientific code and desktop gui apps. I would love to use Rust for my projects, but it just doesn't cut it. The libraries I am using are very mature in C++, but the libraries in Rust to accomplish the same thing are still too immature to consider in my projects.


Out of curiosity, what are these libraries that are missing equivalents in Rust?

Also, can you not use these libraries from Rust?


Not the OP, but here's a quick sample.

Anything related to HPC and HFT, CUDA, game engines (Unreal/CryEngine/Ogre3D/Godot vs Bevy), Qt/WinUI/MFC/VCL/FireMonkey/wxWidgets/KDE, COM/XPC/Binder, compiler frameworks (Graal/GCC/LLVM).

Yes, many of those could be used from Rust, and some of them already are, provided there are bindings. Then again, it's the classic question of whether one wants to maintain bindings or write the application they care about.


C++ is a terrible language.


Care to elaborate?


>If you want people to switch to something better, make something better.

We already have C, which is better. It existed before C++. It will exist long after C++ is dead. If you don't over-architect and over-abstract your code you can be more productive in C than in C++ precisely because you avoid the kind of nonsense that this article (and the endless other articles about overcoming C++'s shortcomings and cognitive overheads) talk about. In the time you spend waiting for C++ code to compile, you can instead just write more code in C.

Look at all the articles that get posted here about C++. There was a post a day or two ago about trying to work around C++ compilation times. How long does it take to compile C projects? Not long at all, if you follow standard rules that have been well-known for decades like forbidding #include in header files.
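
(For anyone who hasn't run into that rule: headers only forward-declare what they mention, and the .c files do the real includes, so each translation unit pulls in a small, flat set of headers. A rough sketch with made-up names:)

    /* widget.h -- no #include lines; forward declarations are enough here */
    #ifndef WIDGET_H
    #define WIDGET_H

    struct arena;   /* defined in arena.h; the prototypes only need the name */
    struct widget;

    struct widget *widget_create(struct arena *a, int id);
    void widget_destroy(struct widget *w);

    #endif
    /* widget.c then does #include "arena.h" and #include "widget.h" itself,
       so the cost of arena.h is paid once, not by every file that uses widgets. */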

>If all the other compilers and languages were THAT much better, they would be used.

Argument from popularity/Blub paradox. If C++ is so good why does it need to be constantly updated and extended? If you think C++23 is better than C++20, then surely you accept that C++23 can at least potentially be worse than things that already exist.

---

People have an attitude towards C. They think "oh but what about std::vector? I will have to write my own containers". But 90% of the time you will have good enough or even better performance with a few lines of code. Often you know statically the maximum size and can just allocate that at compile time, or you know before filling a collection what the size will be. In C++, most code dealing with vectors that I have seen pre-sizes the vector anyway. That could just be a 'malloc' or a static array. The result is that you compile in debug mode and your program actually works, you can use a debugger, everything Just Works. Programs run at almost full speed in debug mode, and debuggers work. It's glorious.

In C++, you spend a hundred thousand lines of code reimplementing standard library containers anyway because anything more complicated than std::vector is unsuitable for non-trivial programs, and then the only way your program will run is with optimisations on, because your 'zero-cost' abstractions are actually very high cost if you aren't compiling with -O3.

Oh yay I have a big standard library in C++. Thanks for saving me from having to use the old ugly BSD sockets API to do networking. I can use a nice modern networking API in C++. Oh... it doesn't exist? Oh well, I'll use a nice modern filesystem API in C++. Oh, it was only added recently and lacks basic functionality? Oh well, I guess in return for learning the 50 different ways of initialising a variable, and all the subtleties of argument-dependent lookup and glprxvalues and template deduction guides and template<T=std::enable_if<std::is_void<void_t<int>>::type>::value> I will get the benefit of using a high-quality standard library that contains std::regex and std::random and std::chrono and std::thread and std::vector<bool> and std::unordered_map and all the other VERY high quality standard library facilities provided by the C++ standard library!

Oh they're all crap. Pity.


No, C is not better. C is much worse. C lacks basic primitives that allow programmers to build useful abstractions and let them concentrate on the business logic rather than on the low-level details of every function.

You can maybe write a small program more easily in C, as long as you can fit everything in your head. And that's why all complex low-level programs are written in C++ and not C. (With the Linux kernel being an exception, because of its stubborn maintainer.)

And what about the drawbacks mentioned in the article? 99% of programmers don't need to care about them. The reason such articles exist is precisely that some people love C++ and like to play with their language.

---

> If C++ is so good why does it need to be constantly updated and extended?

Uh? To adapt and evolve to a changing environment, instead of stagnating and becoming irrelevant. If it is so great, why does the iPhone constantly get new models? Why does every maintained piece of software release new versions? And that's why you should use C++ instead of C.


Why would you respond to my post but ignore all the points I made? Do you just want to rant?

I explained why it doesn't matter that "C lacks basic primitives...". C has the basic primitives. It lacks generalist containers, for example. But most of those 'generalist containers' are, as I already explained, just not very good. For example, take this post:

https://zeux.io/2023/06/30/efficient-jagged-arrays/

What do we see here? The guy starts off with basically 'normal C++' and what he has at the end is essentially something you could write in C. When you optimise C++, you end up getting C. He has two std::vectors, but they're pre-sized so they're essentially the same as writing 'unsigned *offsets = calloc(vertex_count + 1, sizeof *offsets);'. This is not any kind of significant boilerplate.
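
For anyone who hasn't read it, the shape he ends up with is roughly this (a from-memory sketch, not the post's actual code):

    #include <cstddef>
    #include <vector>

    // CSR-style jagged array: one flat data buffer plus an offsets array,
    // where bucket i occupies data[offsets[i] .. offsets[i + 1]).
    // Both vectors are sized up front by a counting pass, so there is no
    // per-bucket allocation at all.
    struct JaggedArray {
        std::vector<unsigned> offsets; // length = bucket_count + 1
        std::vector<unsigned> data;    // length = total element count

        unsigned *begin(std::size_t bucket) { return data.data() + offsets[bucket]; }
        unsigned *end(std::size_t bucket)   { return data.data() + offsets[bucket + 1]; }
    };

Which is, as you say, morally identical to the calloc version in C.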

The low-level details are something you NEED to concentrate on. If you don't care about the low-level details, then you shouldn't use C or C++. That's the point of low-level languages: control over low-level details.

All complex low-level programs are written in C. C is used in every production operating system kernel. C is the language in which systemd is written. It's the language in which pretty much every production web server or network server is written in. It's the language used by pretty much everyone interfacing with hardware.

The Linux kernel isn't an exception. It is the typical case.

>And what about the drawbacks mentioned on the article? 99% of programmers don't need to care about them.

The drawbacks mentioned in the article are major issues. They regularly bite C++ programmers, and are always lurking. You can't just say '99% of programmers don't need to care about them'. The whole point of namespaces in C++ was that you didn't need to care about name clashes. The result of argument-dependent lookup is that instead of getting a linker error on a name clash, like you do in a sane language, you instead can get random weird functions being called, silently.
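
A contrived sketch of the failure mode I mean (the names are made up, but the lookup behaviour is real):

    #include <iostream>

    namespace lib {
        struct Widget { int id = 0; };
        // Added by the library in some later version:
        void report(const Widget&) { std::cout << "lib::report(Widget)\n"; }
    }

    // Your own helper, which you believe every report(...) call goes through:
    void report(int x) { std::cout << "::report(int) " << x << "\n"; }

    int main() {
        lib::Widget w;
        report(w);    // ADL quietly finds lib::report through Widget's namespace
        report(w.id); // only this one calls your ::report(int)
    }

No warning, no error, just a different function than the one you thought you were calling.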

>Uh? To adapt and evolve to a changing environment. Instead of stagnating and becoming irrelevant.

You've missed the point. You say C++ is so wonderful, but it's one of the languages that today is changing the most in idiom and specification. Since C++11, we're constantly being told about the new 'proper' way of doing things in C++. They're adding new features then going 'oh actually that was a bad idea' in every new revision. If it were so great, why would it be changing so much?

None of its changes are to 'evolve to a changing environment'. In C++11? Sure, they added threads, they added atomics. Pretty hard to argue that it wasn't an evolution to adapt to the new multithreading world. But today C++ still doesn't even have a networking API standardised. It still doesn't have a decent filesystem API. It can't even adapt to the world that existed when it was FIRST standardised, back in 1998. How is anything they're doing adapting to the world of today?

---

Really none of what you said addressed anything I actually said.


> If C++ is so good why does it need to be constantly updated and extended?

You could make that argument about almost any technology (certainly in the IT world). C23 is out this year which will make it the 6th "official" version - obviously a much more slowly evolving language than most, but nonetheless still being fairly regularly "updated and extended".

There are plenty of things I don't care for in C++, but I can (and do) choose not to use those parts. The few times recently that I had to work with pure C code, it felt, well, decidedly primitive - but there was no choice but to grin and bear it. C works great in some contexts, but I'm not sure I can imagine tolerating it for large-scale application development.


C23 is a bad standard with a massive pile of misfeatures. The only reason there even is a C23 is that a bunch of C++ people have infiltrated the committee in an attempt to force changes into C23. For example, the guy whose website is 'thephd.dev' is a self-proclaimed Rust fanatic who wants to see the death of C. He is also on the committee in a leadership position. The C committee.

The problem is that it's a language that doesn't need to change much. It doesn't really need a committee. People that recognise this don't bother getting involved. That leaves it vulnerable to entryists.

You cannot pick and choose which C++ features to use in practice if you use any kind of library including the standard library.


Better get one of those old C compilers that are still written in C then.


C++ is one of the most successful programming languages in history. It makes all new languages look like a drop in the ocean comparatively.

Millions of professional C++ developers work on large-scale, mission-critical C++ code that controls everything from airlines to Fortune 500 companies to cars to nuclear reactors to military hardware. And new C++ projects are started every single day.

Most developers have zero clue on what it takes for a programming language to be successful in the real world. And they have zero clue on how much $ and time it would take to rewrite the billions of lines of C++ code out there that runs the world.


Imo, rewriting the code isn't the only prohibitively difficult problem. Replacing the very advanced development tools C++ has seems almost impossible; so far no competitor to C++ has come very close, and it's a moving target. You can debug template instantiations over the GDB remote protocol; you can print out an AST and make queries on it (much richer than Tree Sitter); you can generate a perfect callgraph (one that understands templates, overloads, and macros) with no effort; you can get suggestions for source code optimizations at the IR bytecode level; every hex editor or whatever can demangle C++ symbols; you can print out the ABI of any polymorphic class or use a bin introspection tool to do the same in reverse; you can organize template compilation errors into one of several easily navigable GUIs; you can JIT C++ code in a REPL or a debugger and place breakpoints on templates, overload sets, or exceptions; you can extend linters and compilers with plugins; and C++ has some of the most advanced static analyzers of any language (not just systems-level ones).

The closest overall in tooling IMO is Rust, but it's currently missing a lot of the above, and very many C++ abstractions cannot be elegantly expressed in Rust due to limitations in generics, const functions, the orphan rule, and a lack of C++-like functor structs.
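
By 'functor structs' I mean the plain stateful-callable pattern, e.g. a toy like this:

    // A stateful callable with a templated operator(): state, overloading by
    // type, and easy inlining in one small struct, with no named interface.
    struct ClampAndCount {
        int limit;
        int clamped = 0;

        template <typename T>
        T operator()(T x) {
            if (x > static_cast<T>(limit)) { ++clamped; return static_cast<T>(limit); }
            return x;
        }
    };

    // ClampAndCount f{10};
    // f(50);   // returns 10, f.clamped is now 1
    // f(3.5);  // the same object works across types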


Yep agree. Most HN developers work on comparatively simple JavaScript CRUD business applications and have zero clue on what it takes to write the large scale high-performance software the world relies on to function.


> This kind of post is why I do not understand C++ programmers whatsoever. It's basically a demonstration of the absurd amount of time C++ takes away from you by being the moldy pile of shit that it is.

What? Exactly what led you to think that nonsense? A random blog post of someone misusing lambda expressions?

There are plenty of things in C++ to criticize, but a) you are not actually criticising anything at all, you just spewed noise, and b) misusing a feature is not a problem with C++.


This isn't a random blog post and the techniques in the blog are actually used by the standard library and some pretty major C++ libraries such as range-v3 and fmtlib.

The author of the blog post is in fact one of the authors of fmtlib.

Getting libraries to play nicely with ADL is actually a huge pain in the ass, and in fact the three major C++ compilers don't agree with one another on how to perform name lookup in a large number of cases.


Don’t waste too much time trying to convince the commenter, as they have a history of accusing people of “spewing noise” in threads about C++.


What a well thought out and considerate post! HN needs more people like you making such valuable contributions.


Honestly these are issues introduced by C++, and avoidable by using C.


You can say that about most general-purpose compiled languages. I guess I don't see why I would use D for any particular compiled use case over something like Zig or Rust? And if I just want to script system tasks, I'm going to use popular languages with popular libraries and batteries included, like Python.


Regardless, you can easily find the ebook online, which Cory Doctorow won't have any problem with.

