It does this even if you give it instructions to make sure the code is truly in the code base? Well, you never told it that it can’t lie.


Telling an LLM 'do not hallucinate' doesn't make it stop hallucinating. Anyone who has used an LLM even moderately seriously can tell you that. They're very useful tools, but right now they're mostly good for writing boilerplate that you'll be reviewing anyhow.



Funnily enough, if you routinely ask them whether their answer is right, they fix it or tell you they hallucinated.
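
A minimal sketch of that "ask it to check itself" loop, using the OpenAI Python SDK. The model name, prompts, and function name are placeholders of my own, and the second pass is no guarantee of catching every hallucination:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def ask_with_self_check(question: str, model: str = "gpt-4o-mini") -> str:
        # First pass: get an answer.
        first = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": question}],
        )
        answer = first.choices[0].message.content

        # Second pass: hand the answer back and ask whether it is actually right.
        check = client.chat.completions.create(
            model=model,
            messages=[
                {"role": "user", "content": question},
                {"role": "assistant", "content": answer},
                {"role": "user", "content": "Is that answer actually correct? "
                                            "If any part is made up, say so and fix it."},
            ],
        )
        return check.choices[0].message.content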


That’s the funny thing about the GP: in a sense, that poster is the one hallucinating. We are having to “correct” their hallucination that they use an LLM seriously.


Nice troll bait, almost got me!



