I absolutely am serious; a lot of games and software in general today demand far more system resources than they have any reasonable right to.
Don't give me "but the textures!" and the like either, optimize that stuff better instead. Whether it's Windows 10/11 or Call of Duty or Elite: Dangerous or Chrome or whatever strikes your fancy, software today has no business demanding the resources it does.
Lest we forget, the hardware we can buy today would have been considered supercomputers just a few years ago. You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
Well, they were given that right by the users who spend a lot of money on those system resources and ask for games to be as beautiful and complex as we can make them (not all users; I'm not the last to spend time on old-school games, but a significant, heavy-spending portion of them).
Business is exactly why most games don't spend an enormous budget on optimization today. It's not something the great majority of customers demand, it quickly becomes time- and cost-heavy, and so the return on investment is pretty low.
Yes, I think that even with an infinite optimization budget, today's triple-A realistic rendering simply wouldn't be possible in real time on a computer that's too old.
I also think that while it would really add value if background applications like Teams/Slack/Discord were less resource-heavy (since they're open but not the main focus), when you play a high-end video game it makes sense to assume it's your main reason for using your computer at that time :)
If simulating and rendering a complete, complex, interactive, realistic-but-imaginary world at today's achievable level of detail seems mundane to you, it's far from seeming that way to me :)
No opinion about browsers and OSes; today's games do a lot more stuff that's valuable to most users than yesterday's did. I don't know enough about the modern value of OSes and browsers, except that empirically they do seem to crash a lot, lot less than 20 years ago, but they also spy on me a lot more :)
The priority of an AAA game developer is to provide as much graphical fidelity as possible for a specific compute budget, not to consume the least compute for a specific level of graphical fidelity. If they "optimize that stuff better", the outcome wouldn't (and shouldn't!) be lower usage of system resources but rather fitting in even more graphical detail while still maxing out all resources.
They obviously do have a reasonable right to demand all the system resources that are available: a game is usually an immersive experience that is the only important thing running on the system at that time, and the whole point of those greatly increased system resources is to be spent on gains in visual quality. There's no reason not to use all the compute power of what would have been considered a supercomputer just a few years ago.
> You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
The fact that you're comparing browsing the internet with playing AAA games speaks volumes. Browsers can make insane amounts of optimizations because the "geometry" of a website is (mostly) completely static: there's no physics, no sound mixing, no AI running client-side, no game logic, etc. That means they get to cache 90% of the view and only update the changed portions of the screen.
Contrast that with a game, which has the entire view of the 3D world changing every 16ms when the user moves their mouse, has thousands of physics interactions happening (most likely at an even higher rate), is buffering and mixing sounds in a 3D world, is animating and loading large 3D assets in real time, is creating photorealistic lighting in real time, and is handling all game logic and AI client-side. It becomes clear that the two fields, while both difficult in their own ways, don't overlap very much. Of course AAA games take a supercomputer to run: they're doing all of that in 16ms, sometimes 7ms!
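To make those numbers concrete: 1000 ms / 60 fps is roughly 16.7 ms per frame, and 1000 ms / 144 fps is roughly 6.9 ms. Here's a minimal sketch of what a fixed-budget frame loop looks like; every engine call named in the comments is a hypothetical placeholder, not any real engine's API:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    // 1,000,000 us / 60 frames ~= 16,667 us per frame.
    const auto frame_budget = std::chrono::microseconds(16'667);

    for (int frame = 0; frame < 600; ++frame) { // ~10 seconds at 60 fps
        const auto frame_start = clock::now();

        // Everything below must fit inside the budget, every single frame
        // (hypothetical placeholders for illustration):
        // poll_input();        // mouse, keyboard, controller
        // step_physics();      // often substepped at a higher rate
        // update_game_logic(); // AI, scripting, networking
        // mix_audio();         // 3D positional sound
        // stream_assets();     // load textures/meshes in the background
        // render_frame();      // lighting, shadows, post-processing

        const auto elapsed = clock::now() - frame_start;
        if (elapsed > frame_budget) {
            std::puts("budget blown -> a dropped frame the player can see");
        } else {
            // Sleep off the leftover budget to hold a steady 60 fps.
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
}
```

Anything in that loop that overruns its slice shows up immediately as stutter, which is exactly why engines are tuned to saturate, but never exceed, whatever hardware they're given.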
Plus, if you don't care about all the visual fidelity and such, most games let you turn a ton of it off. Games have never been mundane: whether we're talking about the original Tetris or the remastered version of The Last of Us, they push the hardware they run on to its limits to achieve incredible immersive experiences.
Not only that! They have also increasingly helped improve state-of-the-art rendering in offline renderers! We're seeing the improvements games have made to achieve real-time photorealistic rendering slowly make their way to the large Hollywood studios. This lets the movies we watch have higher-fidelity CG, because the artists get quicker iteration times, and it reduces the compute load required for these massive CG scenes, since they're using more optimized rendering techniques. Saving money, and our environment.
Lest we forget, these "mundane" games have led to huge breakthroughs in all sorts of fields because of their willingness to push the boundaries of our machines to see what's truly possible, as opposed to 90% of the software created today, which runs orders of magnitude slower than it needs to because people can't, or don't know how to, write efficient software.