
How do you determine whether or not ChatGPT just made up whatever answer it gives you?



How do you determine if the webpage Google gave you is honest? Most people don't seem to care.

Phind provides references. The problem is that as the webpages used to feed LLMs are increasingly written by LLMs, we're going to be up to our necks in even more [subtly] wrong information than the disinformation already very widely peddled by advertisers and political groups.


> How do you determine if the webpage Google gave you is honest? Most people don't seem to care.

That's the thing that surprises me the most about these "How do you know ChatGPT is correct?" questions. They seem to expose the fact that a lot of people were taking whatever they found on the internet (sites surfaced by Google, Wikipedia pages, Reddit comments, etc.) at face value without checking it further. Most things people find on the internet are potentially inaccurate and need to be double checked.


Google gives me sources for answers and I can evaluate them to determine whether the answer is correct. Those sources are often forums, where a discussion about a topic from many different sources may occur. Often, I am checking to see whether several different sources are giving the same answer. Google enables me to exercise media literacy. When ChatGPT gives me an answer, I have no idea where the information came from, nor do I have any tools to evaluate the answer.



