AMD recently annoyed the gaming community - who are among the most susceptible to unnecessary upgraditis - by saying, in effect, that friends don't let friends buy GPUs with more than 8GB of VRAM. If you look at the Steam hardware surveys, that's probably true. An M-series MacBook can play most stuff that isn't current AAA. Proton lets you play virtually anything on Linux and it's incredible (plus the Steam Deck). There's no point buying a $1000+ card for gaming; you're better off subscribing to a game streaming service.
Was mod support that common back in the day? Morrowind was pretty revolutionary in that you could load the entire "level" into the Construction Set and see how the professionals built the quests. A few other games shipped with map editors (I remember Age of Mythology having one), but I feel like the games that are genuinely moddable are the notable exception.
Otherwise, servers have always been a problem for developers. Do you let people self-host and run the risk of rampant cheating on random servers, or do you host centrally and eat the cost? I do think the option of self-hosting is important: for every Counter-Strike there are tons of abandoned RTS games that nobody plays anymore.
To play Doom: The Dark Ages, a video card with ray tracing support and 8 GB of VRAM is the minimum spec, so AMD's advice is already out of date.
Cheating is a huge problem, yes. To solve it you need to implement Trusted Computing at the hardware, firmware, and OS levels. In the short term, more and more games will follow the lead of Apex Legends and simply ban Linux players, because the very flexibility of Linux that makes hobbyists prefer it also enables rampant cheating.
In the long term, technologies like Pluton will make the PC a locked-down platform and the whole question will be moot: future PCs will just be Xboxes that can run Excel. User-created content, including mods and custom servers, might be re-enabled in such an era for some games, provided there are enough protections against shenanigans (piracy, cheating in multiplayer).
Map editors and modding have always been pretty common for both turn-based and real-time strategy games, and the entire MOBA genre originated from RTS mods. Bethesda RPGs have active modding communities in part because they always need community-made patches to be playable. Doom, Quake, and Unreal had very fruitful mod and fork ecosystems, with offspring like Team Fortress and Counter-Strike that went mainstream. Several simulation games even shipped with the Gmax 3D modeling program.
I think this is still survivorship bias, though, and the games that traditionally offered map editors still do. There are a lot of games nowadays that allow user-contributed levels and content, if not full-on mods.
It was a little tongue-in-cheek, but this comment by Frank Azor has been widely circulated (and taken out of context):
> Majority of gamers are still playing at 1080p and have no use for more than 8GB of memory. Most played games WW are mostly esports games. We wouldn't build it if there wasn't a market for it. If 8GB isn't right for you then there's 16GB. Same GPU, no compromise, just memory