
C has aged better than Java though. While Java still pretty much expects that a memory access is cheap relative to CPU performance, as it was in the mid-'90s, C at least gives you enough control over memory layout to tweak a program for the growing CPU/memory performance gap with some minor changes.

In Java and other high-level languages which hide memory management, you're almost entirely at the mercy of the VM.

IMHO "complete control over memory layout and allocation" is what separates a low-level language from a high-level language now, not how close the language-level instructions are to the machine-level instructions.



Picking Java is a bad example; plenty of high-level languages are AOT-compiled and provide the same control over memory allocation as C does.

And Java is in the process of getting those features as well anyway.


What popular "high level" languages might we consider? Scanning various lists, you have bytecode-based languages (Java, .NET languages, etc), the various scripting languages (Python, Ruby, Perl, etc), languages compiled to a VM (Scala, Elixir, etc), and extensions to C (C++, Objective-C). It seems all of those are either built on the C memory model with extensions or use a VM that hides the memory layout.

> provide the same control over memory allocation as C does.

But the argument in this thread is about something or other eventually being lower-level than C, right? C++, Objective-C, D and friends are "high-low": they provide higher-level structure on top of the basic C model. In most conceptions that puts them higher than C, but we can put them at the same level if we want, hence the "high-low" usage, which is common; I didn't invent it.

Basically, the flat memory model that C assumes is what the optimization facilities in these other languages might grant you. Modern CPUs emulate this model and deviate from it, both in that some memory accesses take longer than others and through bugs in the hardware. But neither of those things is a reason for programmers not to use this model normally; it's a reason to be aware, add "hints", choose modes, etc (though it's better if the OS does that).

And maybe different hardware could use a different sort of overt memory. BUT, the C programming language is actually not a bad way to manipulate mixed memory, so multiple memory types wouldn't particularly imply "ha, no more C now". A lot of this is cache, and programmers manipulating cache directly seems like a mistake most of the time. But GPUs? Nothing about GPUs implies no more C (see CUDA, OpenGL; C++? fine).


.NET-based languages include C++ as well, and .NET has had AOT compilation to native code in multiple forms for ages.

Latest versions of C# and F# also do make use of the MSIL capabilities used by C++ on .NET.

Then if we move beyond those into AOT-compiled languages with systems programming capabilities still in use in some form: D, Swift, FreePascal, RemObjects Pascal, Delphi, Ada, Modula-3, Active Oberon, ATS, Rust, Common Lisp, NEWP, PL/S, structured BASIC dialects; more could be provided if going into more obscure languages.

C is neither the genesis of systems programming, nor did it provide anything that wasn't already available elsewhere, other than easier ways to shoot yourself in the foot.




