What is your "almost certainty" based on? What does it even mean? Every thread on LLMs is full of people insisting their beliefs are certainties.
What I am certain of is that we should not praise the inventor of ball bearings for inventing flight, nor claim that once ball bearings were invented, flight became unavoidable and only a matter of time.
I say 'almost certainly' because LLMs are basically just a way to break language down into its component ideas. Any AGI-level machine will almost certainly be capable of swapping semantic 'interfaces' at will, and something like an LLM is a very convenient way to encode that interface.