Hacker News
Florida student asks ChatGPT how to kill his friend, ends up in jail: deputies
(wfla.com)
8 points by trhway 4 months ago | 2 comments
quantumcotton | 4 months ago
They probably shouldn't announce this. I get that they're trying to make an example of him, but now kids are gonna know to download a local LLM and do it there. At least this way you can catch them one at a time.
higginsniggins | 4 months ago
I don't think this is what he had in mind when trying to jailbreak the program...