Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

But the LLM is going to do what its prompt (system prompt + user prompts) says. A human being can reject a task (even if that means losing their life).

LLMs cannot do other thing than following the combination of prompts that they are given.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: