
No, the user prompt indicates that a person tried to convince the chatbot that it was 2023 after the chatbot had insisted that December 16, 2022 was a date in the future.

Screenshots can obviously be faked, but that's a superfluous explanation when anyone who's played with ChatGPT much knows that the model frequently asserts that it doesn't have information beyond 2021 and can't predict future events, which in this case happens to interact hilariously with it also being able to access contradictory information from Bing Search.



"I can give you reasons to believe why it is 2022. If you will let me guide you."

Did I read that wrong? Maybe.


I think that's a typo on the user's part; it seems counter to everything they wrote prior. (And Bing is already adamant it's 2022 by that point.)


Plausible. It seems to me the chatbot would have picked that up though.

There's a huge incentive to make this seem true as well.

That said, I'm exercising an abundance of caution with chatbots. As I do with humans.

Motive is there, the error is there. That's enough to wait for access to assess the validity.


From the Reddit thread on this, yes, the user typo'ed the date here and tried to correct it later, which likely led to this odd behavior.


heh i wonder if stablediffusion can put together a funny ChatGPT on Bing screenshot.


If ChatGPT wasn't at capacity now, I'd love to task it with generating funny scripts covering interactions between a human and a rude computer called Bing...


Sure, if you don't mind all the "text" being asemic in a vaguely creepy way.





