I think there are two ways of looking at it. First, raster has more or less plateaued: there haven't been any great advances in a long time, and it's not like AMD or any other company has offered an alternative path or vision for where they see 3D graphics going. The last thing a company like Nvidia wants is to be a generic good that's easy to compete with or simple to compare against. Nvidia was also making use of its strength/long-term investment in ML to drive DLSS.
Second, Nvidia is a company that wants to sell stuff at a high asking price, and once a certain tech gets good enough, that becomes more difficult. If the 20 series had just been an incremental improvement on the 10, and so on, then I expect sales would have plateaued, especially if game requirements don't move much.
I don't believe we have reached a raster ceiling. More and more it seems like groups are in cahoots to push RTX and ray tracing. We are left to speculate why devs are doing this. nvidiabux? an easier time adding marketing keywords? who knows... i'm not a game dev.
There's no need to imply deals between Nvidia and game developers in smoke-filled rooms. It's pretty straightforward: ray tracing means less work for developers, because they don't have to manually place lights to make things look "right". Plus, they can harp on about how it looks "realistic". It's no different from the explosion of Electron apps (and similar technologies building apps with HTML/JS), which might be fast to develop, but are bloated and feel non-native. And it's not like there's an Electron corp handing out "electronbux" to push app developers to use Electron.
> We are left to speculate why devs are doing this.
Well, I am a gamedev, and currently lead of a rendering team. The answer is very simple - because ray tracing can produce much better outcomes than rasterization with lower load on the teams that produce content. There's not much else to it, no grand conspiracy - if the hardware was fast enough 20 years ago to do this everyone would be doing it this way already because it just gives you better outcomes. No nvidiabux necessary.
> There's not much else to it, no grand conspiracy
True, in that ray tracing is the future. Though I'd call it less a conspiracy and more just the truth that "RTX" as a product was Nvidia creating a 'new thing' to push AMD out of. Moat building, plain and simple. Nvidia's cards were better at it, unsurprisingly; much like mesh shaders, they basically wrote the API standard to match their hardware.
And just to make sure Nvidia doesn't get more credit than it deserves: the debut RTX cards (the RTX 20 series) were a complete joke. A terrible product generation offering no performance gains over the 10 series at the same price, with none of the cards really being fast enough to actually do RT well. They were still better at RT than AMD's, though, so mission accomplished, I guess.
Raster quality is limited by how much effort engine developers are willing to put into finding computationally cheap approximations of how light/materials behave. But it feels like the easy wins are already taken?
All the biggest innovations in "pure" rasterization renderers in the last 10-15 years have actually been some form of raytracing in a very reduced, limited form.
Screen-space ambient occlusion? Marching rays (tracing) against the depth buffer to calculate a crude but decent-looking approximation of light occlusion. Some of the modern SSAO implementations like GTAO need to be denoised by TAA.
Screen-space reflections? Marching rays against the depth buffer and taking samples from the screen to generate light samples. Often needs denoising too.
Light shafts? Marching rays through the shadow map and approximating back scattering based on whether the shadowed light is occluded or not.
That voxel cone tracing thing UE4 never really ended up shipping? Tracing's in the name, you're just tracing cones instead of rays through a super reduced quality version of the scene.
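The techniques above all share the same core loop: step a ray through screen space and compare its depth against the depth buffer. A minimal sketch of that loop in Python (all names and the toy setup are mine, not from any real engine; in a shader this would run per-pixel on the GPU):

```python
def march_depth_buffer(depth, origin, direction, steps=32):
    """March a ray against a depth buffer in screen space.

    depth:     2D grid (list of rows) of linear depth per pixel.
    origin:    (x, y, z) start point -- pixel coords plus depth.
    direction: (dx, dy, dz) step offset per iteration.
    Returns the first pixel where the ray passes behind the stored
    surface (a "hit", as in SSAO/SSR), or None if it never does.
    """
    h, w = len(depth), len(depth[0])
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(steps):
        x, y, z = x + dx, y + dy, z + dz
        px, py = int(x), int(y)
        if not (0 <= px < w and 0 <= py < h):
            return None            # ray left the screen: no information
        if z > depth[py][px]:
            return (px, py)        # ray went behind the surface: hit
    return None
```

The `return None` on leaving the screen is exactly why these effects break down at screen edges and behind objects: the depth buffer simply has no data there, which is the limitation full ray tracing removes.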
Material and light behavior is not the problem. Those are constantly being researched too, but the changes are more subtle. The big problem is light transport. Rasterization can't solve that, it's fundamentally the wrong tool for the job. Rasterization is just a cheap approximation for shooting primary rays out of the camera into a scene. You can't bounce light with rasterization.
For rasterization to be useful, it must approximate what light does in the real world. So rasterization that wants to get closer and closer to reality has to emulate more and more of it, and will eventually have to cast exactly the rays it was hoping to avoid.
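The light-transport point can be made with a toy recursion (this is a sketch of the rendering-equation idea, not a real renderer; the scene encoding is entirely my own invention). Each surface's outgoing radiance is its emission plus its albedo times the radiance arriving from the surfaces it can see. Cutting off at zero bounces, which is roughly what rasterization can evaluate without precomputed lighting, loses all indirect light:

```python
def shade(emission, albedo, visible, bounces):
    """Toy radiance leaving a surface.

    emission: light the surface emits itself.
    albedo:   fraction of incoming light it reflects.
    visible:  list of (emission, albedo, visible) tuples for surfaces
              seen from here -- a hypothetical mini scene graph.
    bounces:  recursion depth; 0 means emission only.
    """
    if bounces == 0:
        return emission
    incoming = sum(shade(e, a, v, bounces - 1) for e, a, v in visible)
    return emission + albedo * incoming
```

With a light, a wall that sees the light, and a floor that only sees the wall, the floor gets zero light unless you allow at least two bounces: the indirect contribution only exists because rays were traced onward from the wall, which is precisely the step rasterization has no way to take.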