If we’re going to keep pushing these kilowatt-scale cards, we’re going to need higher-voltage rails on PSUs. I had a bunch of similar dumb power-connector problems when my 4090 was new.
19V is pretty standard in notebooks, so 19-24V could probably be done with fairly little trouble. 48V would entail developing a whole new line of capacitors, inductors, power stages (transistors), and controllers.^1
^1: yes, of course, 48V compatible components exist for all of those. But the PC industry typically gets components developed for its specific requirements because it has the volume.
The computer industry already has these. 48-54V PoE (Power over Ethernet) that ends up powering equipment whose logic runs at 1.8-3.3V is extremely common.
TL;DR: Yes there is a small difference in efficiency, but it's still plenty efficient.
You need a switching regulator for the current 12V anyway (as opposed to a linear regulator, which is much simpler but basically just burns off power to drop the voltage), so the question is whether increasing the input voltage 2-4x at the same power makes a difference.
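To make that concrete, here is a rough back-of-the-envelope sketch (the 600W card and ~1V core are assumed ballpark numbers, not from any specific GPU). It just shows why linear regulation is a non-starter at GPU power levels and why the real comparison is switcher vs switcher:

```python
# Rough illustration with made-up but representative numbers:
# why the comparison is buck converter vs buck converter.
V_IN = 12.0     # input rail (V)
V_CORE = 1.0    # assumed GPU core voltage (V)
P_CORE = 600.0  # assumed power delivered to the core (W)

i_core = P_CORE / V_CORE                  # 600 A at ~1 V
p_linear = (V_IN - V_CORE) * i_core       # a linear reg burns the whole voltage drop
print(f"linear regulator loss:    {p_linear:.0f} W")   # ~6600 W, absurd

eff = 0.90                                # ballpark buck-converter efficiency
p_switch = P_CORE / eff - P_CORE          # loss of a ~90% efficient switcher
print(f"switching regulator loss: {p_switch:.0f} W")   # ~67 W
```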
- You need higher-voltage-rated components (mainly relevant for capacitors), potentially a bit more expensive, but negligible relative to GPU costs. The inductor loss will be a bit higher too (same DC resistance, but the bigger voltage step across the inductor means more ripple current and slightly more I²R loss), but this is negligible.
- On the other hand, you need thinner traces / less copper and have more flexibility in the design (rough numbers in the sketch below).
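Here is what that copper argument looks like with illustrative numbers (the 600W draw and 5mΩ path resistance are assumptions, just to show the scaling). For fixed power, doubling the voltage halves the current, and conduction loss goes with I²R, so it drops by 4x:

```python
# Illustrative only: conduction loss in the cable/connector/traces for a
# fixed-power card at different rail voltages.
P_GPU = 600.0   # W drawn by the card (assumed)
R_PATH = 0.005  # ohms, assumed total resistance of cable + connector + traces

for v_rail in (12.0, 24.0, 48.0):
    i = P_GPU / v_rail          # current drops as voltage rises
    loss = i**2 * R_PATH        # I^2 * R conduction loss
    print(f"{v_rail:4.0f} V rail: {i:5.1f} A, {loss:5.2f} W lost in the path")
# 12 V: 50 A, ~12.5 W; 24 V: 25 A, ~3.1 W; 48 V: 12.5 A, ~0.8 W
```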
For some concrete numbers, here is a random TI regulator datasheet [1]; check out Figure 45. At 5V/3A output, the difference in efficiency between 12V, 24V, and 42V inputs is maybe 5%.
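To put that ~5% spread in absolute terms at that operating point: the 90%/85% levels below are assumed for illustration (only the roughly-5% gap comes from the figure), so treat it as a sketch, not datasheet data:

```python
# What a ~5% efficiency spread costs in heat at the 5 V / 3 A operating point.
# Efficiency levels are assumed; only the ~5% gap is taken from the comment above.
P_OUT = 5.0 * 3.0                          # 15 W delivered to the load

for label, eff in (("12 V in", 0.90), ("42 V in", 0.85)):
    p_loss = P_OUT / eff - P_OUT           # power burned in the regulator
    print(f"{label}: ~{p_loss:.2f} W lost")
# ~1.67 W vs ~2.65 W, i.e. about a watt of extra heat on a 15 W load
```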
I think the main problem is that the industry needs to move together. Nvidia can't require 24/48V input before there is a standard for it and enough PSUs on the market that offer it. This kind of chicken-and-egg situation has played out a bunch of times before, so it's not a showstopper, but it will take a while.