GPT-4 has a 32k-token context window. I'm sure someone out there is implementing the pipework for it to use some of that as a scratchpad under its own control, in addition to its input.
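Roughly, I'd expect that pipework to look something like this. A minimal sketch against the old openai Python client; the SCRATCHPAD convention here is invented for illustration, not an existing feature:

    import openai

    SYSTEM = (
        "Besides answering the user, keep notes for yourself. End every "
        "reply with a line 'SCRATCHPAD: <your full updated notes>'; that "
        "line will be fed back to you on the next turn."
    )

    scratchpad = ""  # the slice of the context window the model controls

    def ask(user_msg):
        global scratchpad
        reply = openai.ChatCompletion.create(
            model="gpt-4-32k",
            messages=[
                {"role": "system", "content": SYSTEM},
                {"role": "system", "content": "SCRATCHPAD: " + scratchpad},
                {"role": "user", "content": user_msg},
            ],
        )["choices"][0]["message"]["content"]
        if "SCRATCHPAD:" in reply:
            # Split the answer from the model's self-written notes and
            # carry the notes forward into the next call.
            answer, _, notes = reply.rpartition("SCRATCHPAD:")
            scratchpad = notes.strip()
            return answer.strip()
        return reply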
In the biological metaphor, that would be individual memory, in addition to the species-level evolution that happens through fine-tuning.
Yeah, I’m doing that to get GPT-3.5 to remember events from other conversations. It never occurred to me to let it write its own memory, but that’s a pretty interesting idea.
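Sketching it out, the self-written version might be as simple as this (the REMEMBER: convention and the memory.txt file are made up for illustration):

    import pathlib
    import openai

    MEMORY_FILE = pathlib.Path("memory.txt")

    def chat_with_memory(user_msg):
        memories = MEMORY_FILE.read_text() if MEMORY_FILE.exists() else ""
        reply = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content":
                    "Notes you chose to keep from past conversations:\n"
                    + memories
                    + "\nIf something is worth keeping, end your reply "
                      "with a line 'REMEMBER: <fact>'."},
                {"role": "user", "content": user_msg},
            ],
        )["choices"][0]["message"]["content"]
        for line in reply.splitlines():
            if line.startswith("REMEMBER:"):
                # Persist the model's own note so future conversations see it.
                with MEMORY_FILE.open("a") as f:
                    f.write(line[len("REMEMBER:"):].strip() + "\n")
        return reply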
ChatGPT changes when we train or fine-tune it. It also has access to local context within a conversation, and those conversations can be fed back as more training data. That's something like a hard divide between short-term and long-term learning.
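Concretely, feeding conversations back is just dumping the logs in the JSONL format the fine-tuning endpoint takes. A sketch, assuming the chat-style format of one {"messages": [...]} object per line; the file name is arbitrary:

    import json

    def conversations_to_jsonl(conversations, path="finetune.jsonl"):
        # Each conversation is a list of {"role": ..., "content": ...} dicts.
        with open(path, "w") as f:
            for convo in conversations:
                f.write(json.dumps({"messages": convo}) + "\n")

    conversations_to_jsonl([[
        {"role": "user", "content": "What did we decide last week?"},
        {"role": "assistant", "content": "You went with the 32k model."},
    ]])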
If it were able to modify its own model and permanently apply those changes to itself, I’d be a lot more worried.