
In general, a citation is something that needs to be precise, while LLMs are very good at generating generic, high-probability text that isn't grounded in reality. Sure, you could implement a custom fix for the very specific problem of citations, but you can't solve every kind of hallucination that way. After all, if you could develop a manual solution, you wouldn't need an LLM.

There are mitigations in use, such as RAG or tool use (e.g. giving the model a browser), but they don't completely fix the underlying issue.
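One narrow version of that "custom fix" can be sketched as a post-hoc grounding check in a RAG pipeline: accept a citation only if it closely matches a document the retrieval step actually returned. A minimal sketch, assuming hypothetical names and simple fuzzy title matching (not any particular library's API):

```python
# Hypothetical post-hoc citation check for a RAG pipeline: a model-cited
# title is accepted only if it closely matches a retrieved document title.
from difflib import SequenceMatcher

def is_grounded(cited_title: str, retrieved_titles: list[str],
                threshold: float = 0.9) -> bool:
    """Return True if the cited title closely matches a retrieved one.

    Fuzzy matching tolerates minor formatting drift; a fabricated title
    that resembles nothing in the retrieved set is flagged.
    """
    cited = cited_title.lower().strip()
    return any(
        SequenceMatcher(None, cited, t.lower().strip()).ratio() >= threshold
        for t in retrieved_titles
    )

retrieved = [
    "Attention Is All You Need",
    "Deep Residual Learning for Image Recognition",
]
print(is_grounded("Attention is all you need", retrieved))    # True
print(is_grounded("A Survey of Imaginary Results", retrieved))  # False
```

This flags fabricated citations after the fact, but it only covers the citation-shaped case; it does nothing for the broader hallucination problem described above.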

My point is that citations are constantly making headlines, yet, at least at first glance, this seems like an eminently solvable problem.

So solve it?


