My son is 4. When he was 2, I told him I loved him. He clearly did not understand the concept or reciprocate.
I reinforced the word with actions that felt good: hugs, warmth, removing negative experiences/emotions, etc. Isn't that just associating a word with certain "good inputs"?
Now, at 4, he gets it more, but still doesn't have a fully fleshed-out understanding of "love". He'll need to layer on more language linked with experience to build a better "understanding".
LLMs have the language part. It seems we'll link that with physical input/output plus a reward system, and then... intelligence/consciousness will emerge, maybe?
"but they don't _really_ feel" - ¯\_(ツ)_/¯ what does that even mean? if it walks like a duck and quacks like a duck...
Extending that: LLM latent spaces are now some 100,000+-dimensional vector spaces. There's a lot of semantic association you can pack in there by positioning tokens in such a space. At this point, I'm increasingly convinced that, with a sufficiently high-dimensional latent space, adjacency search is thinking. I also think GPT-4 is already close to being effectively a thinking entity, and that it's more limited by the lack of an "inner loop" and a small context window than by latent space size.
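To make "adjacency search" concrete, here's a toy sketch of the basic operation: cosine-similarity nearest-neighbor lookup over an embedding matrix. The vocabulary and the random 8-dimensional vectors are made up for illustration; a real model's latent space has thousands of dimensions and learned token positions.

```python
# Toy "adjacency search": find the tokens whose embeddings sit closest
# (by cosine similarity) to a query token. The vectors below are random
# stand-ins, not any real model's embeddings.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["love", "hug", "warmth", "duck", "quack", "context"]

# One random unit vector per token; rows normalized so that a dot
# product equals cosine similarity.
emb = rng.normal(size=(len(vocab), 8))
emb /= np.linalg.norm(emb, axis=1, keepdims=True)

def neighbors(token: str, k: int = 3) -> list[tuple[str, float]]:
    """Return the k tokens nearest to `token` in the embedding space."""
    q = emb[vocab.index(token)]
    sims = emb @ q                    # cosine similarities to the query
    order = np.argsort(-sims)         # most similar first
    return [(vocab[i], float(sims[i])) for i in order if vocab[i] != token][:k]

print(neighbors("love"))
```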
Also, my kids are ~4 and ~2. At times they both remind me of ChatGPT. In particular, I've recently realized that some of their "failure modes" in thinking/reacting, which I could never describe succinctly, seem to fit perfectly the idea of "too small a context window".
It seems humans might be, too...?