sabas123 | 6 months ago | on: Modern-Day Oracles or Bullshit Machines? How to th...
If you build an LLM whose design goal is to state "I do not know" for any answer that is not directly in its training set, then none of the above statements hold.
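
To make the idea concrete, here is a minimal toy sketch of that design goal: a "model" that only returns answers literally present in its training data and abstains on everything else. The lookup table and function name are hypothetical illustrations, not a real LLM or any actual abstention mechanism.

    # Toy illustration: answer only what is directly in the "training set",
    # otherwise abstain. TRAINING_SET and abstaining_answer are hypothetical.
    TRAINING_SET = {
        "capital of france": "Paris",
        "boiling point of water at sea level": "100 C",
    }

    def abstaining_answer(question: str) -> str:
        """Return a memorized answer, or abstain when the question is unseen."""
        key = question.strip().lower().rstrip("?")
        return TRAINING_SET.get(key, "I do not know")

    print(abstaining_answer("Capital of France?"))           # Paris
    print(abstaining_answer("Who wins the 2030 World Cup?")) # I do not know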