Maybe Nvidia, but they are a chip / hardware maker first. And even for them, a $50B training run with no exponential gains seems unreasonable.
Better to optimize the architecture / approach first, which is also what most companies are doing now before scaling out.