
Would anyone pay for an LLM that can parse 10-K reports hallucination-free?

I was exploring this idea recently; maybe I should ship it.



Grok 4 SuperHeavy can almost certainly do this out of the box?


I haven't tried SuperHeavy, but why would it? All transformer-based LLMs are pretty prone to hallucinations, even with RAG... it can be pretty good, I guess.
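Even with RAG, nothing forces the model's answer to actually come from the retrieved passages. One common mitigation (just a sketch of the general idea, not how Grok or any specific product does it) is to make the model return a verbatim supporting quote alongside each extracted value and reject anything that can't be matched back to the filing text. The function name, field names, and the numbers below are made up for illustration:

  import re

  def verify_extraction(filing_text, extraction):
      # Accept an LLM extraction only if its supporting quote appears
      # verbatim in the filing and the quote contains the extracted value.
      normalize = lambda s: re.sub(r"\s+", " ", s).strip()
      quote = normalize(extraction.get("quote", ""))
      value = extraction.get("value", "")
      filing = normalize(filing_text)
      return bool(quote) and quote in filing and value in quote

  # Hypothetical example: the model claims revenue was $4.1 billion and cites a sentence.
  filing_text = "Total revenue for fiscal 2023 was $4.1 billion, up 12% year over year."
  claim = {"value": "$4.1 billion",
           "quote": "Total revenue for fiscal 2023 was $4.1 billion"}
  print(verify_extraction(filing_text, claim))  # True -> keep; False -> flag for human review

Anything that fails the check gets flagged for human review instead of being trusted, which is about as close to "hallucination-free" as this setup gets.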

Any articles to learn more about it?



