Most games don't need to change the screen resolution anymore, which is the expensive part: not only do you have to wait for the hardware to settle, you also have to throw out basically everything in GPU memory and set it all up again.
Also, having to throw out basically everything in GPU memory is largely a thing of the past in the first place.
I still have an instinctual reluctance to change the screen resolution in a game's settings screen, even though 99% of the time it's instantaneous these days.
Modern games typically don't change the screen resolution at all. If there is a resolution setting, it usually just controls the internal render resolution, and the final pass scales that up to the display's native resolution. Changing the screen resolution only made sense with CRTs, where the display is actually capable of different resolutions, unlike LCD displays, which have a single native resolution; non-native resolutions need to be resampled (either by the display, the GPU, or the game).
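For illustration, here's a rough SDL2 sketch of that pattern (the 1280x720 internal resolution is just an assumed example value, not anything from the comments above): the window stays at the desktop's native resolution, the scene renders into an offscreen target at the "resolution setting", and the final copy stretches it to the screen.

```c
#include <SDL.h>

int main(void) {
    SDL_Init(SDL_INIT_VIDEO);

    /* Borderless window at the desktop's native resolution; no mode change. */
    SDL_Window *win = SDL_CreateWindow("demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        0, 0, SDL_WINDOW_FULLSCREEN_DESKTOP);
    SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);

    /* The "resolution setting": an offscreen target at the internal resolution. */
    int internal_w = 1280, internal_h = 720;
    SDL_Texture *target = SDL_CreateTexture(ren, SDL_PIXELFORMAT_RGBA8888,
        SDL_TEXTUREACCESS_TARGET, internal_w, internal_h);

    /* Render the scene at the internal resolution... */
    SDL_SetRenderTarget(ren, target);
    SDL_SetRenderDrawColor(ren, 30, 30, 60, 255);
    SDL_RenderClear(ren);

    /* ...then the final pass scales it up to the native-resolution window. */
    SDL_SetRenderTarget(ren, NULL);
    SDL_RenderCopy(ren, target, NULL, NULL);  /* NULL dst rect = stretch to window */
    SDL_RenderPresent(ren);

    SDL_Delay(2000);
    SDL_DestroyTexture(target);
    SDL_DestroyRenderer(ren);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}
```

Changing the resolution setting then just means recreating the offscreen target at a different size, which is why it feels instant.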
> you have to throw out basically everything in GPU memory and reset it all
This is not an inherent limitation of GPUs but an artifact of Windows' exclusive fullscreen concept. Just another thing that was simply accepted as the way things are instead of being improved (until exclusive fullscreen went out of style).
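To make the distinction concrete, here's a small sketch of the two fullscreen styles as they appear in SDL2 (the flags are real SDL2 flags; the helper function is just for illustration):

```c
#include <SDL.h>

void go_fullscreen(SDL_Window *win, int exclusive) {
    if (exclusive) {
        /* "Exclusive" fullscreen: may switch the display mode, which is
         * what historically invalidated GPU state on Windows. */
        SDL_SetWindowFullscreen(win, SDL_WINDOW_FULLSCREEN);
    } else {
        /* Borderless / desktop fullscreen: keeps the desktop's native
         * mode, so toggling it is effectively instant. */
        SDL_SetWindowFullscreen(win, SDL_WINDOW_FULLSCREEN_DESKTOP);
    }
}
```

Most modern titles effectively take the second path, which is why the old mode-switch penalty rarely shows up anymore.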