
Yeah; this should've been well within some second-tier customer service manager's discretionary refund budget. Instead they've got a precedent-setting ruling that makes the chatbot a huge liability.


> Instead they've got a precedent-setting ruling that makes the chatbot a huge liability.

Good.


Recall when the Chevy dealership's chatbot agreed to sell a Chevy Tahoe for $1...

https://www.businessinsider.com/car-dealership-chevrolet-cha...


If Chevy dealerships had $1 bereavement Chevy Tahoes as a normal business practice, I think that one would've gone the other way.

In the Air Canada case, it was a clear good-faith effort to understand the rules of a fare he was legitimately entitled to.


I think eventually courts will decide that whatever a chatbot or employee says while working for the company is binding on the company, as long as there was no malicious intent on either the part of the employee or the customer.


Probably not "whatever", but anything that passes a "reasonableness" standard. It's totally reasonable to expect that if an airline offers bereavement fares that you could file the paperwork a month later because, like, there's a lot going on if you need a bereavement fare and getting the paperwork together can take time.

There are lots of things that an employee might say that would not be reasonable, even if they had no malicious intent.


Yeah, reasonableness is the usual requirement in law unless otherwise specified. The ruling we're talking about here uses reasonableness several times:

"Generally, the applicable standard of care requires a company to take reasonable care to ensure their representations are accurate and not misleading."

"I find Air Canada did not take reasonable care to ensure its chatbot was accurate."

"Mr. Moffatt says, and I accept, that they relied upon the chatbot to provide accurate information. I find that was reasonable in the circumstances."


Australian consumer law basically already does this (for employees), especially in the context of e.g. assuring fitness for purpose.


If it hasn't happened yet for employees, why do you think it will happen for chatbots?


A clear "paper" trail.

What's described here does happen when employees send emails with explicit promises, but it gets harder when only the company has proof of the exchange (the recording of the call). Chatbots bridge that gap.


Some people do record calls to companies. I choose to understand "Calls may be recorded" as permission and say "Thank you", even if perhaps that's not always how it was intended. When calling small very sketchy companies (such as my last landlord before I bought somewhere to live) I ask for permission and hang up if they refuse. Oh you don't want your words recorded? Fine, you're getting a letter, hope you weren't in a hurry.

But Chat Bots often provide a "Save transcript" feature, or even default to emailing you a copy if you're in a Customer Service type environment where it knows your email. So those are both a lot easier than setting up call recording.


Yes. iOS and stock Android are really doing a disservice by blocking call recording when, in this day and age, most real phone calls are made to corporations or other non-private settings.


Yep. Intent matters.


> chatbot a huge liability

This is an appropriate outcome, in my view. I'm as pro-AI as they come. But I also recognize that without a clear standard of service delivery, and in an industry inundated with M&A instead of competition, a chatbot isn't a helpful thing but a labor cost reduction initiative.


I hope they do become huge liabilities as it's irresponsible. I'm as excited as the next guy about the future, but companies shoehorning terrible "AI" to replace support channels is infuriating. The -only- good interaction I've had with a chatbot was the one Amazon uses or used to use, where you could click from a few options, and it'd just issue you a refund.


I wouldn't rule out that it was deliberately allowed to escalate by people politically opposed to the chatbot. It would have been a great move.



