I think the dividing line you'd see is that there exists some task where the language-based model suddenly fails despite the fact that it "should" be able to do it.
I'd conjecture that this might include something like describing where places are in relation to each other and then asking it to describe a route between them. (I'm not an NLP expert, but I work with AI folks; I chose this task as an example because it seems like something you'd want a planner for rather than anything ML-heavy.)
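To make the contrast concrete: the kind of route-finding I have in mind is trivial for a classical planner. Here's a minimal sketch using breadth-first search over a made-up graph of places (the place names and the `shortest_route` helper are purely illustrative, not from any real system):

```python
from collections import deque

def shortest_route(edges, start, goal):
    """Find a shortest path between two places via BFS.

    edges: list of (place_a, place_b) pairs, treated as bidirectional.
    Returns the path as a list of places, or None if unreachable.
    """
    graph = {}
    for a, b in edges:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, []).append(a)

    queue = deque([[start]])  # queue of partial paths
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

# Hypothetical spatial relations, stated as adjacency:
edges = [
    ("home", "park"), ("park", "library"),
    ("library", "cafe"), ("home", "station"),
    ("station", "cafe"),
]
print(shortest_route(edges, "home", "cafe"))  # -> ['home', 'station', 'cafe']
```

The point isn't that a language model can't memorize answers like this, but that BFS gets it right by construction, whereas a pure next-token predictor has no such guarantee.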