No, the user prompt indicates that a person tried to convince the chatbot that it was 2023 after the chatbot had insisted that December 16, 2022, was a date in the future.
Screenshots can obviously be faked, but that's a superfluous explanation here: anyone who's played with ChatGPT much knows the model frequently asserts that it has no information beyond 2021 and can't predict future events, which in this case interacts hilariously with its ability to pull contradictory information from Bing Search.
If ChatGPT weren't at capacity right now, I'd love to task it with generating funny scripts covering interactions between a human and a rude computer called Bing...