Hacker News

This seems like exactly the opposite of everything I've read from the rationalists. They even called their website "Less Wrong" to acknowledge that they are probably still wrong about some things, rather than right about everything. A lot of their early writing is about cognitive biases. They have written a lot about "noticing confusion" when foundational beliefs turn out to be wrong. There's even an essay about what it would feel like to be wrong about something as fundamental as 2+2=4.

Do you have specific examples in mind? (And not to put too fine a point on it, do you think there's a chance that you might be wrong about this assertion? You've expressed it very confidently...)

They're wrong about how to be wrong, because they think they can calculate their way around it. Calling yourself "Bayesian" and calling your beliefs "priors" is so irresponsible that it erases all of that: it means you don't take responsibility for holding silly beliefs, because you don't even think of yourself as holding them.
