But C++ is over 40 years old, and Java et al. displaced it in like five minutes. And the footprint savings not being worth much is pretty much the point: if you get 1GB/core and your program uses less, you still can't run more programs on the machine. The machine is exhausted when the first of RAM/CPU is, not when both are.
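A back-of-envelope sketch of that "first resource wins" point, with made-up per-process numbers (nothing here is measured, it just illustrates the min()):

```python
# Toy capacity model: how many copies of a service fit on one box.
# The box is done when the *first* resource runs out, so shrinking the
# footprint below the CPU-implied limit doesn't buy you extra copies.
# All figures are invented for illustration.
CORES, RAM_GB = 64, 64  # a 1 GB/core machine

def instances(cores_per_proc, ram_gb_per_proc):
    by_cpu = CORES // cores_per_proc
    by_ram = RAM_GB // ram_gb_per_proc
    return int(min(by_cpu, by_ram))

print(instances(cores_per_proc=1, ram_gb_per_proc=1.0))   # 64
print(instances(cores_per_proc=1, ram_gb_per_proc=0.25))  # still 64: CPU-bound, the saved RAM sits idle
```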
Nah. I'm an old man. I remember when Java 1.0 shipped. It got relatively little initial enterprise adoption considering it was from Sun who had a lot of enterprise contracts. Traction took years and was often aligned with adoption of Tim's crap hypermedia system, the "World Wide Web" which he'd created years prior but was just beginning to intrude into normal people's lives by the end of the 1990s.
A big factor is that Java was an entire ecosystem: you're getting a programming language (which is pretty good), a novel virtual machine (most of their ideas fell through but the basic thing is fine), dedicated hardware (mostly now forgotten), a component architecture, and a set of tools for all of them.
You're still on this 1GB/core thing, which is the wrong end of the scale in two senses. Firstly, I've worked on systems where we'd want 1TB/core, so today that means buying a lot of CPU performance you don't need just to get enough RAM - because, as you say, that machine is "exhausted" anyway.
But more importantly the scale for big products isn't dictated by RAM or CPU it's dictated by service provision, and at large scale that's just linear. Twice as much provision, twice the cost. Avoiding a GC can let you slash that cost. Cutting a $1M annual bill in half would justify hiring a Rust programmer though likely not a whole team. Cutting a $1Bn annual bill in half - which is much more like what Microsoft are spending on O365 - is obviously worth hiring a team.
It's not instant, though. GC tuning is basically instant; RIIR (rewrite it in Rust) might take three, five, even ten years. So don't expect results tomorrow afternoon.
Rust is as old now as Java was when JDK 6 came out. But it's not just Java. Look at Fortran, C, C++, JS, C#, PHP, Ruby, Go, and even the late-bloomer Python - no popular language had an adoption rate as low as Rust's at this rather advanced age. The trend just isn't there. It may well be that low-level programming will slowly shift to Rust and Zig, but there is no indication that low-level programming as a whole isn't continuing its decline (in use, not importance).
> Avoiding a GC can let you slash that cost
But it doesn't, because RAM isn't the bottleneck in the vast majority of cases. It doesn't matter how linear costs are if RAM isn't the thing that's exhausted. That's why the use of manual memory management has been declining for decades.
At 1TB per core things still don't change, because GC no longer has a high footprint cost in the old gen. You may use 15x RAM for the first 50 MB, but the overhead on each subsequent GB is very small, thanks to the generational hypothesis: the older an object is, the less frequently it is allocated. The cost of a moving/tracing GC is proportional to allocation-rate * live-set / heap-size per generation. When the live set is large, the allocation rate in that generation is low, which means that the heap size premium is also low.
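To put rough numbers on that proportionality - all rates and sizes below are invented, and I'm using free headroom (heap minus live set) rather than total heap, which comes to the same thing once the heap is much bigger than the live set:

```python
# Toy per-generation cost model for a moving/tracing collector:
# each cycle traces the live set, and cycles happen about as often as the
# allocation (or promotion) rate refills the free headroom, so
#   cost/sec ~ alloc_rate * live_set / (heap - live_set)
# All numbers are invented for illustration.

def gc_cost(alloc_rate_mb_s, live_set_mb, heap_mb):
    return alloc_rate_mb_s * live_set_mb / (heap_mb - live_set_mb)

# Young gen: 50 MB live, furious allocation, the 15x footprint premium.
young = gc_cost(alloc_rate_mb_s=1_500, live_set_mb=50, heap_mb=750)

# Old gen: 100 GB live, but promotion into it is a trickle, so a 10%
# footprint premium already makes it cheaper to collect than the young gen.
old = gc_cost(alloc_rate_mb_s=1, live_set_mb=100_000, heap_mb=110_000)

print(round(young), round(old))  # ~107 vs 10, in the same arbitrary units
```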
> So don't expect results tomorrow afternoon.
Manual memory management is the older option, and it's been in decline for decades precisely because the cost savings go in the other direction (not in all situations, but in most) due to the economics of RAM costs vs CPU costs. Without a marked change in those economics, the trend won't reverse even in another 50 years.
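For a sense of why those economics point that way, here's the same kind of toy arithmetic with made-up unit prices (only the core-to-GB price ratio matters, not the absolute figures):

```python
# Toy fleet-cost model with invented unit prices: does paying a GC's RAM
# premium on a CPU-bound fleet actually move the bill? Both prices below
# are assumptions; only their ratio matters.
PRICE_PER_CORE_MONTH = 30.0  # assumed
PRICE_PER_GB_MONTH = 3.0     # assumed

def monthly_cost(cores, ram_gb):
    return cores * PRICE_PER_CORE_MONTH + ram_gb * PRICE_PER_GB_MONTH

lean = monthly_cost(cores=1_000, ram_gb=1_000)  # tight, GC-free footprint
gc   = monthly_cost(cores=1_000, ram_gb=2_000)  # same fleet with a 2x heap premium
print(round(gc / lean, 2))  # 1.09: doubling the RAM moves the bill by single digits
```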