What does that have to do with whether LLMs spit out bullshit?


Wouldn't it increase the probability of bullshit if they're trained on bullshit?



