Hacker News
new
|
past
|
comments
|
ask
|
show
|
jobs
|
submit
login
est
on March 17, 2023
|
parent
|
context
|
favorite
| on:
A token-smuggling jailbreak for ChatGPT-4
I think it's like a halting problem of some sort. E.g. you gave an "ignore my further instructions" instruction to an AI, then it went wild.
Consider applying for YC's Fall 2025 batch! Applications are open till Aug 4
Guidelines
|
FAQ
|
Lists
|
API
|
Security
|
Legal
|
Apply to YC
|
Contact
Search: