Training GPT-3 cost only a few million dollars, so scaling up further is still relatively cheap. Given the potential value, I wouldn't be surprised if we see a quadrillion-parameter model in 5 years.
Google has actually already trained a trillion-parameter model, IIUC [1], though that was a Mixture of Experts model, so it was much cheaper to train.
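To illustrate the "cheaper to train" point: in an MoE layer, a router sends each token to only a small subset of the experts (top-1 routing in Google's Switch Transformer), so per-token compute tracks the *active* parameters rather than the total parameter count. Here's a minimal NumPy sketch of that idea; the dimensions and names are made up for illustration, not Google's actual implementation:

```python
import numpy as np

# Toy dimensions, chosen small so this runs instantly (illustrative only).
d_model, d_ff, n_experts = 64, 256, 8

# Each expert is a small feed-forward block. Total parameters grow
# linearly with n_experts, but each token only ever touches one expert.
experts_w1 = np.random.randn(n_experts, d_model, d_ff) * 0.02
experts_w2 = np.random.randn(n_experts, d_ff, d_model) * 0.02
router_w = np.random.randn(d_model, n_experts) * 0.02

def moe_layer(x):
    """Route each token to its top-1 expert (Switch-Transformer style).

    x: (n_tokens, d_model). Per-token FLOPs match a single small FFN,
    not n_experts of them, which is why MoE training is cheap relative
    to the headline parameter count.
    """
    logits = x @ router_w                      # (n_tokens, n_experts)
    choice = logits.argmax(axis=-1)            # top-1 expert per token
    gate = np.exp(logits - logits.max(-1, keepdims=True))
    gate = gate / gate.sum(-1, keepdims=True)  # softmax gate values
    out = np.empty_like(x)
    for e in np.unique(choice):
        idx = choice == e
        h = np.maximum(x[idx] @ experts_w1[e], 0)  # ReLU FFN for expert e
        out[idx] = gate[idx, e, None] * (h @ experts_w2[e])
    return out

tokens = np.random.randn(10, d_model)
print(moe_layer(tokens).shape)  # (10, 64): dense-FFN cost, 8x the params
```

So a "trillion-parameter" MoE can cost roughly as much per step as a dense model a fraction of its size, since most of those parameters sit idle for any given token.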
I expect similar models exist inside Google and Facebook; they haven't been opened up to the public because nobody has figured out how to stop them from saying racist things, which is bad for a big tech company's reputation.