Hacker News

Nothing is perfect, but some tools let you validate the answer. Search engines give you search results, not an answer, so you can apply the traditional methods for evaluating the reliability of a source. An academic resource on the toxicity of various chemicals is probably fairly trustworthy, while a blog from someone trying to sell you healing crystals probably isn't.

When you're using ChatGPT to find information, you have no indication of whether what it's regurgitating comes from a high-reliability source or a low-reliability source, or whether it's just a random collection of words whose only purpose is to make grammatical sense.

The most frustrating experience I've had: I asked it something, pointed out why the answer was obviously wrong, and it confirmed the faulty logic, showed what part of the answer would have looked like with sound logic, and promised to answer again without the acknowledged inaccuracy. Then it gave an even more absurd answer than the first time around.


