You forget that it makes stuff up and you won't know it until you google it. When googling, fake stuff stands out because truth is consistent.
Querying multiple LLMs at the same time and comparing the results is a much closer analogue to googling, but no one does this.
As I said, you are talking to a super-confident journalism intern who can give you answers, but you won't know if they're true or only partially true until you consult a human source of knowledge.
It's not even similar to asking the old guys at the Home Depot, because they can tell you when they're unsure they have a good answer for you. An LLM won't. Old guys won't hallucinate facts the way an LLM will.
It really is Searle's Chinese room, 21st-century epistemological-nightmare edition. The grammar checks out, but whatever gets spit out doesn't necessarily bear any resemblance to reality.
LLMs train from online info. Online info is full of misinformation. So I would not trust an answer to be true just because it is given by multiple LLMs. That is actually a really good way to fall into the misinformation trap.
My point was that googling gets you a variety of results from independent sources. So I said that querying multiple LLMs is as close as you can get for a similar experience.
I agree with everything you said, except I think we're both right at the same time.
Ol' boy at the Depot is constrained by his own experiences and knowledge, absolutely can hallucinate, oftentimes will insert wild, irrelevant opinions and stories while getting to the point, and frankly if you line 6 of them up side by side to answer the same question, you're probably leaving with 8 different answers.
There's never One True Solution (tm) for any query; there are 100 ways to plumb your way out of a problem, and you're asking a literal stranger who you assume will at least point you in the right direction (which is kind of preposterous to begin with)
I encourage people to treat LLMs the same way -- use them as a jumping-off point, a tool for discovery that's no more definitive than asking for directions at some backwoods gas station. Take the info you get, look deeper with other tools, work the problem, and you'll find a solution.
Don't accept anything they provide at face value. I'm sure we all remember at least a couple teachers growing up who were the literal authority figures in our lives at the time, fully accredited and presented to us as masters of their curriculum, who were completely human, oftentimes wrong, and totally full of shit. So goes the LLM.