Hacker News
Florida student asks ChatGPT how to kill his friend, ends up in jail: deputies (wfla.com)
8 points by trhway 4 months ago | 2 comments


They probably shouldn't announce this. I get that they're trying to make an example of him, but now kids are going to know to download a local LLM and do it there instead. At least this way you can catch them one at a time.


I don't think this is what he had in mind when trying to jailbreak the program...



