
"Hallucinations" (ie a chatbot blatantly lying) have always struck me as a skill issue with bad prompting. Has this changed recently?

To a skilled user of a model, the model won't just make shit up.

Chatbots will of course answer unanswerable questions because they're still software. But why are you paying attention to software when you have the whole internet available to you? Are you dumb? You must be if you aren't on Wikipedia right now. It's empowering to admit this. Say it with me: "I am so dumb, Wikipedia has no draw to me." If you can say this with a straight face, you're now equipped with everything you need to be a venture capitalist. You are now an employee of Y Combinator. Congratulations.

Sometimes you have to admit that the questions you're asking are unlikely to be answered by the core training documents, and you'll get garbled responses: confabulations. Adjust your queries accordingly. This is the answer to 99% of the issues product engineers have with LLMs.

If you're regularly hitting random bullshit, you're prompting it wrong. Models only yield good results for prompts that resemble what they've already seen. Find a better model or ask better questions.

Of course, none of this is news to people who actually, regularly talk to other humans. This is just normal behavior. Hey, maybe if you hit the software more it'll respond kindly! Too bad you can't abuse a model.



A sufficiently skilled person can use a warped and bent slide rule to design a jet engine.

But that doesn't mean the warped slide rule and a supercomputer capable of finite element analysis are equally useful or powerful.


You can easily crush a person with a supercomputer. But what are you going to do with a slide rule—lightly tap someone?


You are lacking in imagination.

(Alas, Dall-E also lacks, so I couldn't generate a picture of deadly slide rule kung fu. At least none that wasn't unintentionally hilarious.)



