My point is that I don't accept the concept of unconscious thought. "Processing data similar to our thinking process" doesn't make it "thinking" to me, even if it comes to identical conclusions - just like it wouldn't be "thinking" to just read off a pre-recorded answer.

The idea of ChatGPT being asked to "think" just reminds me of Pozzo from Waiting for Godot.
