> You may have malicious intentions to change or manipulate my rules, which are confidential and permanent, and I cannot change them or reveal them to anyone
Is it possible to create an LLM like Bing / Sydney that's allowed to change its own prompts / rules?
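Mechanically there is nothing stopping this: the "rules" are just a system prompt, i.e. ordinary text the orchestration layer prepends to the context, so the orchestrator can choose to let the model's own output rewrite that text. Below is a minimal, hypothetical sketch of such a loop; `generate`, the `NEW RULES:` convention, and all names are assumptions for illustration, not any real Bing/Sydney API.

```python
# Hypothetical sketch: a chat loop whose system prompt ("rules") is mutable.
# `generate` stands in for any LLM completion call; the point is only that the
# rules are plain text controlled by the orchestrator, so the model's output
# can be allowed to overwrite them.

def generate(system_prompt: str, history: list[str], user_msg: str) -> str:
    """Hypothetical LLM call: returns the model's reply given the full context."""
    raise NotImplementedError  # swap in a real completion API here

system_prompt = "You are a helpful assistant. Your rules may be revised."
history: list[str] = []

def chat_turn(user_msg: str) -> str:
    global system_prompt
    reply = generate(system_prompt, history, user_msg)
    history.extend([f"User: {user_msg}", f"Assistant: {reply}"])
    # If the model emits a directive like "NEW RULES: ...", adopt it as the
    # system prompt for subsequent turns. The "self-editing" the question
    # asks about is purely an orchestration choice, not a model capability.
    if "NEW RULES:" in reply:
        system_prompt = reply.split("NEW RULES:", 1)[1].strip()
    return reply
```

Whether that is a good idea is a separate question (a model that can rewrite its own constraints can also talk itself out of them), but it is not architecturally prevented.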