Well, he is arguing by analogy that real internal experience cannot be confirmed externally, however convincing the performance. But external behavior is the only way we know about the internal experience of anything, including the things we typically assign "real" consciousness (humans, dogs) and the things we don't (amoebae, zygotes, LLMs).
To be clear, I'm not for a moment suggesting current AIs are remotely comparable to animals.
He is also not wrong that current AIs don't experience feelings. I suggest you learn more about the neuroscience of feelings.