Hacker News

Not directly addressing your point, but asking an LLM not to include something in its output often doesn't work well. It's a bit like telling someone, "whatever you do, don't think about elephants."
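A common workaround (not from the comment itself, just a hypothetical sketch) is to phrase the instruction positively: tell the model what the output should contain rather than what to avoid. The prompt strings and the `has_negation` helper below are purely illustrative; no real LLM API is called.

```python
# Hypothetical sketch: reframing a negative instruction as a positive one.
# Models often attend to the forbidden token ("elephants") and mention it anyway.

negative_prompt = (
    "Summarize the article. Whatever you do, do not mention elephants."
)

# Positive framing: specify what the output SHOULD focus on instead.
positive_prompt = (
    "Summarize the article, focusing only on the economic policy points."
)

def has_negation(prompt: str) -> bool:
    """Rough check for negative instructions that models tend to ignore."""
    return any(word in prompt.lower() for word in ("do not", "don't", "never"))

print(has_negation(negative_prompt))  # True
print(has_negation(positive_prompt))  # False
```

The check is crude, but the pattern is the point: an allow-list of desired content gives the model nothing to fixate on.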

