I know what the expression means and tend to agree with the duck test. I just disagree that ChatGPT passes the "lying duck" test. A "lying duck" would be more systematic and consistent in its output of false information. ChatGPT occasionally outputs incorrect information, but there's no discernible motive or pattern; it just seems random and unintentional.

If it looked like ChatGPT was intentionally being deceptive, it would be a groundbreaking discovery, potentially even prompting a temporary shutdown of ChatGPT servers for a safety assessment.
