Ah, but isn’t that the problem here? Asking an LLM for facts without requesting a search is like asking a PhD to answer a question “off the top of your head.” For pop-culture questions, the PhD likely brings little value.
They should know better than to guess. Educated, honest, intelligent people don't spout off a wild-ass guess; if they don't know something, they say so.
I don't think they mean "knowledge" when they talk about "intelligence." LLMs are definitely not knowledge bases. They can transform information given to them in impressive ways, but asking a raw (non-RAG-enabled) LLM to provide its own information will probably always be a mistake.
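The raw-vs-RAG distinction above can be sketched concretely: retrieval-augmented generation fetches supporting text first and asks the model to answer *from that text*, instead of from whatever it memorized in training. This is a minimal illustrative sketch; the corpus, the word-overlap scoring (a stand-in for real vector search), and the prompt wording are all assumptions, not any particular library's API.

```python
def retrieve(query, corpus, k=1):
    """Rank documents by naive word overlap with the query
    (a toy stand-in for a real embedding search) and return the top k."""
    qwords = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(qwords & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Ground the question in retrieved text so the model isn't asked
    to supply facts from its own weights."""
    context = "\n".join(retrieve(query, corpus))
    return (
        f"Answer using only this context:\n{context}\n\n"
        f"Question: {query}\n"
        "If the context doesn't say, reply 'I don't know.'"
    )

# Hypothetical two-document corpus for illustration.
corpus = [
    "The Eiffel Tower was completed in 1889.",
    "Photosynthesis converts light energy into chemical energy.",
]
print(build_prompt("When was the Eiffel Tower completed?", corpus))
```

The point of the pattern is the last instruction in the prompt: the model is explicitly given an out ("I don't know") instead of being pushed to guess, which is exactly the complaint in the comments above.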
They kind of are knowledge bases, just not in the usual way. The knowledge is encoded in the words they were trained on. They weren't trained on words chosen at random; they were trained on words written by humans to encode some information. In fact, that's the only thing that makes LLMs somewhat useful.