Hacker News

What on Earth makes you think that training a model on all factual information is going to do a lick of good toward generating factual outputs?

At that point, clearly our only problem has been that we've done it wrong all along by not training these things solely on academic textbooks! That way we'll only probabilistically get true things out, right? /s



