In a recent, eye-opening legal case, an Air Canada passenger won against the airline and received a partial refund after being misled by the company’s AI assistant (or chatbot) about its bereavement fare policy. This incident raised significant questions about the responsibility and reliability of conversational AI in customer service.
We talk to Vincent Nichol, an aviation expert with Irwin Mitchell LLP, to understand what this means for businesses. Vincent has over a decade of experience claiming against and defending airlines in passenger claims, often rooted in tort, contract, or other liability regimes.
See: Vincent Nichol - Aviation Claims Solicitor in London | Irwin Mitchell
What was Air Canada’s argument?
Air Canada argued that it could not be held liable for information provided by one of its agents, servants, or representatives, including its chatbot service. In effect, it suggested the chatbot was a separate legal entity responsible for its own actions, but offered no evidence to support that position.
Is there anything to that idea?
Neither Air Canada's website nor its General Conditions of Carriage references the chatbot service, and the airline did not provide the relevant contract to the Court, which severely weakened its argument.
Would that be specific to an airline or apply equally to all businesses?
To some degree, it depends on the nature of the business. A healthcare provider, for example, might have a fiduciary (in essence, a higher) duty of care towards its customers. General retail businesses would have a standard duty of care but might still be in breach and find themselves liable.
What precautions should a business take to avoid a situation like this?
Businesses must understand from this judgment that offering chatbot technology to customers, while very useful, does not absolve them of liability. Chatbot output is likely to be construed as part and parcel of the information provided on their websites or in their Terms and Conditions, so they remain responsible for ensuring that the information provided is accurate.
Choosing Rasa for reliable generative conversational AI
In light of the Air Canada incident, businesses need a solid conversational AI framework to avoid misinformation while building customer trust. Rasa provides a platform for building accurate, engaging text- and voice-based AI assistants.
Here's why Rasa is the smart choice for your conversational AI needs:
- Accuracy and nuance: Rasa's technology understands the complexity of human language, thereby minimizing misinformation risks. It ensures your AI assistants provide precise, hallucination-free responses to customer inquiries (a minimal sketch of this grounding approach follows this list).
- Customization at the core: With Rasa, you can align your AI’s voice closely with your brand, ensuring interactions reflect your company's ethos and accuracy standards.
- Security you can trust: Protecting your data is a top priority at Rasa. Our platform features robust security measures, and you retain control of your own data, helping you safeguard customer information and comply with regulations like GDPR and HIPAA.
- Growth-ready: Rasa’s scalable solutions mean your conversational AI can grow alongside your business, ensuring quality customer service at every scale.
- Beyond transactions: Rasa elevates customer interactions from just conversations to meaningful engagements. Our platform is designed to create rich, personalized experiences that build loyalty and connection.
- Community and support: Choosing Rasa gives you extensive learning resources and a community of experts. We’re here to support you every step of the way, ensuring your success in the conversational AI space.
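To make the accuracy point concrete, here is a minimal sketch of how a business might keep a chatbot's policy answers grounded in vetted text rather than free-form generation, using a custom action built on the Rasa SDK. The action name, the `policy_topic` slot, and the policy dictionary are hypothetical illustrations for this example, not Rasa defaults or any airline's actual policy wording.

```python
# A minimal sketch of a Rasa custom action that answers policy questions
# from reviewed text instead of generating it. Names and policy content
# below are illustrative assumptions, not Rasa defaults.
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher

# Hypothetical source of truth: in practice this could be your published
# policy document or an internal API, kept in sync with your website and
# Terms and Conditions.
VERIFIED_POLICIES: Dict[Text, Text] = {
    "bereavement_fares": (
        "Bereavement fare requests must be made before travel. "
        "Please see our published policy for full conditions."
    ),
}


class ActionAnswerPolicyQuestion(Action):
    """Answer policy questions from vetted text rather than improvising."""

    def name(self) -> Text:
        return "action_answer_policy_question"

    def run(
        self,
        dispatcher: CollectingDispatcher,
        tracker: Tracker,
        domain: Dict[Text, Any],
    ) -> List[Dict[Text, Any]]:
        # Hypothetical slot holding the policy topic extracted upstream.
        topic = tracker.get_slot("policy_topic")
        answer = VERIFIED_POLICIES.get(topic) if topic else None
        if answer:
            dispatcher.utter_message(text=answer)
        else:
            # Fall back to a safe handoff rather than guessing.
            dispatcher.utter_message(
                text="I don't have verified information on that policy. "
                     "Let me connect you with an agent."
            )
        return []
```

Keeping policy wording in a single reviewed source, rather than letting the assistant improvise it, directly addresses the kind of inaccurate answer at issue in the Air Canada case.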
Rasa delivers a superior service that respects customer intelligence and fosters deeper connections. By partnering with Rasa, you ensure your conversational AI is a reliable, engaging extension of your brand, ready to meet customer needs with the correct information every time.
Embrace Rasa for your conversational AI projects to set your business apart with exceptional customer service. Read our whitepaper Navigating Compliance in Regulated Industries with CALM to learn more.