> The way I read the comment in the context of the GP, schizophrenia starts to look a lot like a language prediction system malfunctioning.
That's what I was going for! Yes, mostly to give the people in the thread who were remarking on ChatGPT's errors a human example of the same type of errors (although schizophrenia is much more extreme). The idea really spawned from someone saying "what if we're all just complicated language models" (or something to that effect).