The other commenter makes it sound a bit more crazy than it is. "Intercept shaders" sounds super hacky, but in reality, games simply don't ship with compiled shaders. Instead they are compiled by your driver for your exact hardware. Naturally that allows the compiler to perform more or less aggressive optimisations, similar to how you might be able to optimise CPU programs by shipping C code and only compiling everything on the target machine once you know the exact feature sets.
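To make the idea concrete, here's a minimal sketch of that compile-on-the-target model. Everything here is hypothetical (real drivers query the GPU and emit actual machine code); it just shows the shape of "ship source, let the driver pick optimizations for the exact hardware":

```python
# Hypothetical sketch: the game ships shader *source*, and the "driver"
# compiles it with optimizations chosen for the hardware it detects.
# All names and feature flags here are invented for illustration.

def detect_hardware_features() -> set:
    # A real driver would query the GPU; we hardcode a feature set here.
    return {"fast_fma", "wave64"}

def compile_shader(source: str, features: set) -> str:
    # Pretend "compilation" is just recording which hardware-specific
    # optimizations got applied to this shader.
    opts = []
    if "fast_fma" in features:
        opts.append("fuse-multiply-add")
    if "wave64" in features:
        opts.append("wide-wavefront-scheduling")
    return "[binary for {!r} with {}]".format(source, ", ".join(opts))

binary = compile_shader("void main() { ... }", detect_hardware_features())
```

The same shader source would compile to a different "binary" on hardware reporting a different feature set, which is exactly why shipping source instead of compiled shaders gives the driver so much room.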
Graphics drivers on Windows definitely do plenty of 'optimizations' for specific game executables, from replacing entire shaders to 'massaging/fixing' 3D-API calls.
And AFAIK Proton does things like this too, but for different reasons (fixing games that don't adhere to the D3D API documentation and/or obviously ignored D3D validation layer messages).
I don't know -- if that other commenter is correct, it does sound pretty crazy.
Improving your compiler for everybody's code is one thing.
But saying "if the shader that comes in is exactly this code from this specific game, then use this specific precompiled binary, or even just apply these specific hand-tuned optimizations that aren't normally applied" -- that does seem pretty crazy to me.
Fingerprinting based on shaders is quite rare. Really, most of the time we detect things like the exe name calling us, or sometimes, very rarely, they will give us a better name through an extension (Unreal Engine does this automatically). From there all the options are simple, but full shader replacements are one of them. In the API I work on, shaders have a built-in hash value, so that along with the identified game means we know exactly which shader it is. Most of the replacements aren't complicated, though; it's just replacing slow things with faster things for our specific hardware. In the end we are the final compiler, so us tweaking things to work better should be expected to a degree.
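The game-plus-hash lookup described above can be sketched in a few lines. This is not any real driver's code; the game name, hash scheme, and replacement table are all invented to show the mechanism:

```python
import hashlib

# Hypothetical sketch of hash-based shader replacement: identify the game
# (e.g. by exe name), hash the incoming shader, and check a table of
# known hand-tuned replacements before compiling normally.

REPLACEMENTS = {
    # (game exe, shader hash) -> hand-tuned replacement source
    ("somegame.exe", hashlib.sha256(b"slow_shader_source").hexdigest()):
        "fast_shader_source",
}

def compile_with_replacements(game: str, source: bytes) -> str:
    key = (game, hashlib.sha256(source).hexdigest())
    # Fall back to compiling the game's own shader when there is no
    # known replacement for this (game, shader) pair.
    return REPLACEMENTS.get(key, source.decode())
```

Only an exact match on both the game and the shader hash triggers the swap; any other game, or a shader that differs by a single byte, falls through to the normal path. That exactness is also why such replacements silently stop applying when a game (or a mod) changes its shaders.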
In the case of the Minecraft mod Sodium, which replaces much of Minecraft's rendering internals, Nvidia optimisations caused the game to crash.
So the mod devs had to implement workarounds to stop the driver from detecting that Minecraft is running... (changing the window title among other things)
And plain Minecraft did the opposite by adding -XX:HeapDumpPath=MojangTricksIntelDriversForPerformance_javaw.exe_minecraft.exe.heapdump to the command line, when they changed the way the game started so that it wasn't detected as minecraft.exe any more. (Side effect: if you trigger a heap dump, the dump file gets that name.)
I would consider "properly testing on our cards" to mean running the game on them occasionally throughout the whole development process. Currently, it's been a lot of games dropping a ton of issues on us two weeks out from release and leaving us to figure them out. I don't even care if they don't do any optimization for our cards, as long as they give us a fighting chance to figure out the issues on our side.
> Card devs also ought to respect that a game dev may have a good business reason for leaving his game running like crap on certain cards.
I can't agree with this, though. Business decisions getting in the way of gamers enjoying games should never be something that is settled for. If the hardware is literally too old and you just can't get it to work, fine, but when a top-of-the-line card is running below 60 fps, that's just negligence.