
I don't think whether it's a "legal matter" or not is what's important here.

OpenAI markets ChatGPT as an accurate tool, and yet much of the time it is not accurate at all. Imagine a Wikipedia clone that claims the earth is flat cheese, or a cruise control that crashes your car every 100th use. Would you call that "just another tool"? Or would it be "a dangerously broken thing you should stay away from unless you really know what you are doing"?



Did I miss the part where OpenAI marketed ChatGPT as a truthful resource?





