Are you sure? Our senses have gaps that are constantly being filled in all day long; it just gets more noticeable when our brain is exhausted and makes errors.
For example, when sleep deprived, people will see things that aren't there, but in my own experience those things are far more likely to be ones that could plausibly be there and make sense in context. I was walking around tired last night and saw a cockroach, because I was thinking about cockroaches after having killed one earlier, but on closer inspection it was a shadow. The same has happened with other things in the past: jackets on a chair, people when driving, etc. It seems, to me at least, that when my brain is struggling it fills in the gaps with things it has seen before in similar situations. That sounds a lot like probabilistic extrapolation over possibilities. I could see this capacity extending to novel thought with a few tweaks.
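A toy sketch of what I mean by "probabilistic extrapolation over possibilities" (entirely my own illustration; the hypotheses and weights are made up): an ambiguous stimulus gets resolved by sampling from context-weighted candidates, so a recently primed concept like "cockroach" wins more often.

```python
import random

def interpret(stimulus, context_weights):
    """Resolve an ambiguous stimulus by sampling an interpretation
    with probability proportional to its contextual prior -- a crude
    stand-in for the brain's probabilistic gap-filling."""
    hypotheses = list(context_weights)
    weights = [context_weights[h] for h in hypotheses]
    return random.choices(hypotheses, weights=weights, k=1)[0]

# Having killed a cockroach earlier inflates that hypothesis's prior,
# so a dark shape on the floor is often resolved as a cockroach even
# though it is really a shadow.
priors = {"cockroach": 0.6, "shadow": 0.3, "leaf": 0.1}
guess = interpret("dark shape on floor", priors)
```

Obviously the brain isn't literally running `random.choices`, but the point stands: what it fills gaps with is drawn from what context makes likely, not from nowhere.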
> Given that GPT-4 is simply a large collection of numbers that combine with their inputs via arithmetic manipulation, resulting in a sequence of numbers, I find it hard to understand how they're "thinking".
Reduce a human to atoms and try to identify which ones cause consciousness or thought. That is the fundamental paradox here, and it is why people think thought is an emergent consequence of the system as a whole, which could also apply to technology.