I don't think I've ever used more than a gigabyte of RAM while programming. I could even do it on a Raspberry Pi. What exactly in your workflow uses 64GB of RAM?
C++ games programming. The build process uses several gigabytes, but running the game in a debug configuration takes 50-55GB because, at the moment, we store every allocation. If I need to run my own servers or bake data, I go over that easily.
When you say "store every allocation", do you mean you never release anything, or just that you store info about every allocation? If it's the former, that sounds kind of crazy; is it common for game developers to do that? If it's the latter, you could always write it to disk (which is what malloc stack logging does).
We had a random memory corruption problem recently, so we started keeping every allocation around without releasing it, in order to verify them periodically. We do free up old memory, but only every 10 million allocations or so.
Maybe it's not super common in games programming, but it's definitely common not to use ref-counted pointers or anything else that would help you here.
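To give a rough idea of the scheme, here is a minimal, single-threaded sketch rather than our actual allocator; names like tracked_alloc, kCanary and kFreeThreshold are just illustrative, and the real version hooks the engine's allocator / global operator new instead of being called explicitly:

```cpp
// Sketch of a "keep everything and verify" debug allocator (illustrative only).
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <cstdint>
#include <deque>
#include <new>

namespace {

constexpr std::uint32_t kCanary = 0xDEADBEEF;         // guard value written before and after each block
constexpr std::size_t   kFreeThreshold = 10'000'000;  // only sweep/free once every N allocations

struct Record {
    void*       base;   // start of the raw allocation (including front canary)
    std::size_t size;   // user-visible size
};

std::deque<Record> g_records;     // every live allocation, oldest first
std::size_t        g_allocCount = 0;

void* tracked_alloc(std::size_t size) {
    // Layout: [canary][user bytes ...][canary]
    std::size_t total = size + 2 * sizeof(std::uint32_t);
    auto* base = static_cast<unsigned char*>(std::malloc(total));
    if (!base) throw std::bad_alloc{};

    std::memcpy(base, &kCanary, sizeof kCanary);
    std::memcpy(base + sizeof kCanary + size, &kCanary, sizeof kCanary);

    g_records.push_back({base, size});
    ++g_allocCount;
    return base + sizeof kCanary;
}

// Walk every retained allocation and check that the guard values are intact.
void verify_all() {
    for (const Record& r : g_records) {
        const auto* base = static_cast<const unsigned char*>(r.base);
        std::uint32_t front, back;
        std::memcpy(&front, base, sizeof front);
        std::memcpy(&back, base + sizeof front + r.size, sizeof back);
        if (front != kCanary || back != kCanary) {
            std::fprintf(stderr, "corruption near %p (size %zu)\n",
                         static_cast<const void*>(base + sizeof front), r.size);
            std::abort();
        }
    }
}

// Memory is not freed on demand; only once in a while is the oldest half
// released, which is what keeps tens of gigabytes resident in a debug build.
void maybe_release_old() {
    if (g_allocCount % kFreeThreshold != 0) return;
    verify_all();
    std::size_t toFree = g_records.size() / 2;
    for (std::size_t i = 0; i < toFree; ++i) {
        std::free(g_records.front().base);
        g_records.pop_front();
    }
}

} // namespace

int main() {
    // Simulate a frame loop that allocates small objects; leftover blocks are
    // deliberately never freed at exit, mirroring the debug build behaviour.
    for (int frame = 0; frame < 100000; ++frame) {
        void* p = tracked_alloc(64);
        std::memset(p, 0xAB, 64);   // normal use of the allocation
        maybe_release_old();
    }
    verify_all();
    std::puts("no corruption detected");
}
```

The point of never recycling memory is that a stray write through a dangling pointer lands in a block that is still tracked, so a periodic verify pass can catch it instead of it silently trashing whatever would otherwise have been reallocated there.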
Machine learning stuff - whilst training datasets usually live in the cloud, dev data alone can use up a lot of RAM. I've recently started dumping my matrices to disk for dev work. Or turning off Chrome and Firefox, which turn out to be the biggest memory hogs on my Ubuntu machine.
16GB of RAM: perhaps 13GB of that is available to the user. If you run Chrome/Spotify/Slack/an editor, you're often left with only 8GB usable.
ML work commonly uses data that is 8GB+, and regularly 32GB+, just for the data itself. Yes, you can work on remote servers, but it's convenient to be able to work and develop locally.