Yes, businesses can be held responsible for false or misleading claims in AI-generated content, even if the material was created by a third party. In a well-known example, Air Canada was found liable after its website chatbot gave a customer incorrect information about bereavement-fare refund eligibility. The tribunal rejected the airline's argument that the chatbot was a separate entity and held the company responsible for all information on its site, automated or not. The same logic extends to businesses that publish freelance or AI-assisted content containing inaccurate or misleading claims without disclaimers or editorial review. Following trusted sources on AI risk, content liability, and evolving regulation helps businesses stay informed and avoid similar missteps.
Another relevant example is a 2023 case involving an insurance blog whose AI-generated content suggested policyholders had rights they did not legally have. After a claim denial, a customer cited the article when challenging the insurer. The article was not the only factor in the dispute, but it prompted a regulatory inquiry into misleading consumer information, and the business was required to revise its publishing standards and introduce a formal review process to prevent future infractions.
Source: The Hidden Legal Risks of Using Freelance AI Content in Your Business