
Wikipedia and LLMs are completely different entities that work in completely different ways.



And yet, much of the same criticism applies to both.


Wikipedia is generally right, and you can check its sources. Everything there was also written by a human.

LLMs are often wrong, and you cannot check their sources. And because the output is generated on the fly, you can intentionally trick one into spitting out falsehoods.

They are not even close.


You can also check the sources of LLMs: just ask for them, then verify them yourself. An LLM is simply more flexible and more powerful than Wikipedia, so you have to be more cautious about its results.
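
For example (just a sketch using the OpenAI Python client; the model may still fabricate URLs, so each citation has to be opened and verified by hand):

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    # Ask the model to answer and to name its sources explicitly.
    resp = client.chat.completions.create(
        model="gpt-4o",  # model choice is arbitrary here; any chat model works
        messages=[{
            "role": "user",
            "content": "When was the Eiffel Tower completed? "
                       "List the sources (URLs) you base this on.",
        }],
    )
    print(resp.choices[0].message.content)
    # The printed "sources" are generated text, not retrieved links:
    # open each URL and confirm it actually supports the claim.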

"Generally right" is not the same as "reliably right", and therefore if you really need to rely on a fact for something important, I would trust neither Wikipedia nor LLMs.


> LLMs are often wrong and you cannot check their sources.

That depends on which LLM you are using; Perplexity and Copilot both cite their sources.



