
LLMs just predict the next word, based on the texts they were trained on. Everything they produce is, by definition, a hallucination. It's just that, sometimes, they put words together in ways that fool people.

They're just ELIZA on steroids, people.
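
To make "just predict the next word" concrete, here's a minimal sketch: a toy bigram model over a made-up twelve-word corpus. Everything here is illustrative; a real LLM replaces the lookup table with a neural network trained on a huge corpus, but the sampling loop is the same idea.

    import random
    from collections import defaultdict, Counter

    # Tiny made-up training corpus (purely illustrative).
    corpus = "the cat sat on the mat the dog sat on the rug".split()

    # Count which word follows which: a bigram "language model".
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def next_word(prev):
        # Sample the next word in proportion to how often it
        # followed `prev` in the training text.
        counts = follows[prev]
        if not counts:  # dead end: `prev` never had a successor
            return None
        words = list(counts)
        weights = list(counts.values())
        return random.choices(words, weights=weights)[0]

    # Generate a short continuation starting from "the".
    word, out = "the", ["the"]
    for _ in range(6):
        word = next_word(word)
        if word is None:
            break
        out.append(word)
    print(" ".join(out))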



Right, and computers just do Boolean math, not anything useful.


A hammer can bonk someone on the head, or it can help Jimmy Carter build someone a house.

We digital logic folks are both tool builders and tool wielders. Usefulness is in the hands of the user, but that doesn't mean the creator isn't a dipsh_t who made a crap tool, or that the user isn't a moron.

My computer lets me build stores of digital information that integrate with other systems, including me. It's useful for me and my family, for sure.


How did ELIZA do on the Advent of Code?


It hallucinated less, that's for sure, because we know exactly how it works. I doubt any of AoC's challenges would suit its simple pattern matching, though; it's 1960s logic (I typed in a magazine's C64 version back in the 1980s).
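
For contrast with an LLM, the entire mechanism fits in a few lines of deterministic pattern matching. This is a toy sketch with made-up rules, not Weizenbaum's actual DOCTOR script, but the keyword-and-template mechanism is the same:

    import re

    # A few made-up rules in the spirit of ELIZA's DOCTOR script.
    RULES = [
        (re.compile(r"\bI am (.+)", re.I), "Why do you say you are {0}?"),
        (re.compile(r"\bI feel (.+)", re.I), "How long have you felt {0}?"),
        (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
    ]

    def respond(text):
        # First matching rule wins; no statistics, no training data.
        for pattern, template in RULES:
            m = pattern.search(text)
            if m:
                return template.format(*m.groups())
        return "Please go on."  # stock fallback when nothing matches

    print(respond("I am stuck on day 5 of Advent of Code"))
    # -> Why do you say you are stuck on day 5 of Advent of Code?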

"There's lies, damned lies, and statistics." --Unknown




