Oh, one has been spectacularly useful to customers by hallucinating a new, more customer-friendly refund policy that courts held the company liable for:
Oh, that is just excellent. I love the pathetic excuses they came up with while trying to weasel their way out of having to honor promises made by their agents.
> Experts told the Vancouver Sun that Air Canada may have succeeded in avoiding liability in Moffatt's case if its chatbot had warned customers that the information that the chatbot provided may not be accurate.
This is disappointing, though. Can I weasel out of contracts if I say that the information I'm providing may not be accurate before signing?
https://arstechnica.com/tech-policy/2024/02/air-canada-must-...
We absolutely need more chatbots like that.