
So are all the humans in this thread.

Except that human mimicry of "reasoning" is usually deployed to justify an emotional feeling, which is arguably even less reliable than the non-feeling machine.



It has served us relatively fine for thousands of years.

LLMs? I'm waiting for one that knows how to avoid saying something clearly wrong with extreme confidence, reasoning or not.


Again, the same can be said of humans.


Unless you're dealing with a psychopath, you can handle the lies using other subsystems.


The website these comments are discussing ("Bullshit Machines") itself says things that are probably wrong, with extreme confidence.



