NEW DELHI: Canada’s largest airline, Air Canada, has been directed to pay compensation to a customer who was provided with inaccurate information by the airline’s chatbot.
According to the Guardian, Jake Moffatt contacted Air Canada in 2022 seeking information about the documents required for a bereavement fare and the possibility of retroactive refunds. A screenshot of Moffatt’s conversation with the chatbot shows he was incorrectly told that he could apply for a refund within 90 days of ticket issuance by completing an online form.
Moffatt then booked tickets to and from Toronto for a family funeral. However, when he applied for a refund, Air Canada said that bereavement fares did not apply to completed travel and redirected him to the bereavement section of the company’s website. Air Canada later acknowledged the chatbot had used “misleading words” and promised to update it.
Moffatt filed a claim seeking the fare difference, prompting Air Canada to mount an unusual defense: the airline argued that the chatbot was a “separate legal entity” responsible for its own actions.
Tribunal member Christopher Rivers termed this argument “remarkable,” emphasizing that Air Canada, as the platform owner, is ultimately responsible for all information on its website.
While Air Canada claimed the correct information was available on its website, Rivers questioned why the webpage titled ‘Bereavement Travel’ should be considered inherently more trustworthy than the chatbot. He noted that customers have no reason to know that one part of the airline’s website is accurate while another is not.
Air Canada’s chatbot was initially introduced as an AI experiment to ease the load on the call center during unexpected flight disruptions.
Rivers concluded, “It should be obvious to Air Canada that it is responsible for all the information on its website, whether it comes from a static page or a chatbot,” and directed Air Canada to pay Moffatt C$650.88, covering the fare difference and additional amounts.