My point is that I don't accept the concept of unconscious thought. "Processing data similarly to our thinking process" doesn't make it "thinking" to me, even if it reaches identical conclusions - just as it wouldn't be "thinking" to read off a pre-recorded answer.
The idea of ChatGPT being asked to "think" just reminds me of Pozzo from Waiting for Godot.