
RAG does a fact search and dumps content relevant to the query into the LLM’s context window.
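A minimal sketch of that retrieve-then-stuff flow, assuming a toy keyword-overlap retriever and a hypothetical llm() completion call (any chat/completion API would do):

  def retrieve(query: str, notes: list[str], k: int = 3) -> list[str]:
      """Rank notes by word overlap with the query and keep the top k."""
      q_words = set(query.lower().split())
      ranked = sorted(
          notes,
          key=lambda n: len(q_words & set(n.lower().split())),
          reverse=True,
      )
      return ranked[:k]

  def answer(query: str, notes: list[str]) -> str:
      """Dump the retrieved notes into the prompt so the model answers from them."""
      context = "\n".join(retrieve(query, notes))
      prompt = (
          "Answer using only the notes below.\n"
          f"Notes:\n{context}\n\n"
          f"Question: {query}\nAnswer:"
      )
      return llm(prompt)  # hypothetical LLM call, stand-in for whatever API you use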

It’s like referring to your notes before answering a question: if your notes are good, you’re going to answer well (barring a weird brain fart). Hallucinating is still possible but extremely unlikely, and a post-generation step can check for that and drop responses that contain hallucinations.
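One way that post-generation check could look, assuming the same hypothetical llm() call and a crude word-overlap grounding test (real systems would use an NLI model or an LLM judge instead):

  def grounded(answer_text: str, context: str, threshold: float = 0.5) -> bool:
      """Flag the answer if too few of its content words appear in the retrieved context."""
      ctx_words = set(context.lower().split())
      ans_words = [w for w in answer_text.lower().split() if len(w) > 3]
      if not ans_words:
          return True
      overlap = sum(w in ctx_words for w in ans_words) / len(ans_words)
      return overlap >= threshold

  def answer_checked(query: str, notes: list[str]) -> str | None:
      context = "\n".join(retrieve(query, notes))
      draft = llm(f"Answer from these notes only:\n{context}\n\nQ: {query}\nA:")
      return draft if grounded(draft, context) else None  # drop suspect responses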


