
What does that mean for LLMs? The Internet has lots of poorly written text. If LLMs can't distinguish nuance, ambiguity, or lack of clarity, then what exactly are they generating, and why would their output be useful?

Taking a poorly written sentence, interpreting it as meaning something incorrect, and then presenting that interpretation in authoritative, confident language is very close to gaslighting.




