For sure, but I think people on the fine-tuning/training/Stable Diffusion side are more concerned with that. They make a big fuss about it and basically talk people out of a perfectly good, well-priced 16GB VRAM card that works out of the box with Ollama and LM Studio for text inference.
That's kind of one of the reasons AMD is a sleeper stock for me. If people only knew.