To be fair (as one who hasn't paid too much attention to these developments in the last few months/years), the sheer number of individual facts that are apparently encoded in a fixed NN architecture is what amazes me the most here?
I get that it has 175B parameters, but it's not like you can use them in a structured way to store, say, a knowledge base...
It's ~650GiB of data. The entire English version of Wikipedia is about 20GiB and the Encyclopædia Britannica is only about 300MiB (measured in ASCII characters) [1].
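For anyone who wants to sanity-check that 650GiB figure, here's a rough Python sketch. It assumes the 175B weights are stored as 32-bit floats (4 bytes each), which is an assumption on my part; half precision would cut the size in half. The Wikipedia and Britannica numbers are just the ones quoted above.

    # Back-of-the-envelope check of the ~650 GiB figure,
    # assuming 175B parameters stored as fp32 (4 bytes each).
    PARAMS = 175e9
    BYTES_PER_PARAM = 4          # fp32; fp16 would halve this
    GIB = 2**30

    model_gib = PARAMS * BYTES_PER_PARAM / GIB
    wikipedia_gib = 20           # English Wikipedia, ASCII text (cited above)
    britannica_gib = 0.3         # Encyclopaedia Britannica, ASCII text (cited above)

    print(f"GPT-3 weights: ~{model_gib:.0f} GiB")               # ~652 GiB
    print(f"vs. Wikipedia:  ~{model_gib / wikipedia_gib:.0f}x")  # ~33x
    print(f"vs. Britannica: ~{model_gib / britannica_gib:.0f}x") # ~2170x

So the raw weights are roughly 33 times the size of the English Wikipedia text, under those assumptions.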
I only use the sources available to me at the time.
Another value published by Wikipedia is 30GiB [1] as of 2020, which includes punctuation and markup.
I explicitly stated the measurement unit as ASCII characters. If you have a better source for your size (remember: ASCII characters for article words only, no markup), feel free to post it.