> It has read every physics book, and can infer the Newtonian laws even if it didn't.
You're confusing "what it is possible to derive, given the bounds of information theory" with "how this particular computer system behaves". I sincerely doubt that a transformer model's training procedure derives Newton's Third Law, no matter how many narrative descriptions it's fed. Leaving aside what the training procedure actually does, that's the sort of thing that only comes up when you have a quantitative description available, such as an analogue sensorium or the results of an experiment.
>when you have a quantitative description available, such as an analogue sensorium, or the results of an experiment.
Textbooks already provide that quantitative description: they unite the mathematical relationships between physics, raw math, and computer science - including their vulnerabilities.
oeis.org, Wikipedia, and the Stack Exchange forums alone would contain enough to approximate a 3D room with gravity and wind forces.
Now add the appendices and indices of unparsed, untold, unrealized mathematical errata and trivia, plus knowledge cross-transferred from other regions that still haven't conquered the language barrier for higher-order arcane concepts...
The model's thought experiments are more useful than our realized experiments - if not at an individual scale now, they will be once subjected to more research.
There could be a dozen faster inverse-sqrt / 0x5F3759DF-style functions right under our noses, and the quantifier and qualifier haven't intersected yet.