Except that human mimicry of "reasoning" is usually applied in service of justifying an emotional feeling, which arguably makes it even less reliable than the non-feeling machine.
LLMs? I'm still waiting for one that knows how not to state something clearly wrong with extreme confidence, reasoning or not.