
Airline Forced to Honor Chatbot's Off-the-Cuff Refund Policy

It’s only a matter of time before Air Canada’s blunder becomes a legal precedent for other Canadians misled by customer service chatbots.
By Adrianna Nine
[Image: An Air Canada plane in the air. Credit: John McArthur/Unsplash]

Air Canada has been ordered to comply with an invented refund policy improvised by its website’s customer service chatbot. Though the airline has spent several months arguing it can’t be responsible for the chatbot’s inaccuracies, Canada’s Civil Resolution Tribunal has decided otherwise in what could become a game-changing decision for other chatbot-happy companies. 

It all started when Jake Moffatt’s grandmother died in November 2022. Moffatt immediately began looking into bereavement fares—reduced fares reserved for those who require transportation following a loved one’s passing—with Air Canada. Using the airline’s chatbot, Moffatt asked how Air Canada issues bereavement rates. The chatbot responded by saying travelers could buy a ticket and then apply for a reduced bereavement rate retroactively: “If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.”

Moffatt bought a one-way ticket to Toronto for the following day. Though the ticket cost $794, a follow-up conversation with an Air Canada representative revealed the same flight would have cost $380 under Air Canada’s bereavement policy. A few days later, Moffatt submitted his ticket for a partial refund using the Ticket Refund Application form the chatbot had linked. In the months that followed, Air Canada denied that bereavement fares could be applied retroactively, even after Moffatt provided screenshots of his conversation with the chatbot. In February 2023, an airline representative said the chatbot had used “misleading words” and that the airline would correct the chatbot’s verbiage, but otherwise refused to issue Moffatt a partial refund.

[Image: An Air Canada plane taxiing. Credit: Isaac Struna/Unsplash]

Moffatt eventually submitted a complaint to Canada’s Civil Resolution Tribunal, one of the world’s first public justice platforms to operate online. Air Canada repeatedly asserted that it could not be held responsible for inaccurate information provided by the chatbot on its website, arguing, “The chatbot is a separate legal entity that is responsible for its own actions.” But after a lengthy back-and-forth, tribunal member Christopher Rivers shot down Air Canada’s defense, saying there was no way Moffatt could have known that one portion of the airline’s website was more accurate than another. Air Canada must now refund the portion of Moffatt’s fare above the $380 bereavement rate, plus his tribunal filing fees.

Though all of this went down in Canada, it will surely set a precedent for companies that use chatbots rather than human customer service agents to field customers’ inquiries. Many of us know that the world’s most popular chatbots regularly make up and spread false information, but it isn’t unreasonable for consumers to expect a company’s chatbot to pull from the company’s own refund policies, terms of service, and FAQs—information that, as Rivers noted, should be consistent across the company’s website.

Though the context differs slightly, it’s hard not to wonder what impact a case like this could have on other chatbot manipulations and pranks. In December, Chris Bakke—ironically the founder of an AI-powered virtual assistant for the real estate industry—“tricked” a Chevrolet dealer’s chatbot into agreeing to sell him a 2024 Chevy Tahoe for $1. Although the Chevrolet of Watsonville chatbot told Bakke they had a “legally binding offer—no takesies backsies,” the dealer ultimately laughed off the “deal” and removed the chatbot from its website.
