On September 26, 2025, the Court of Appeal of Alberta (ABCA) released its first decision addressing the potential for generative artificial intelligence (GenAI) tools to create references to non-existent case law or statutes, commonly referred to as “hallucinations.”
In Reddy v. Saroya (Reddy), the appellant’s factum contained citations to non-existent cases, which appeared to be created by GenAI. The ABCA is considering awarding enhanced costs against the appellant’s counsel personally for failing to confirm the accuracy of their cited cases.
This appeal is one of several recent cases from across Canada involving the submission of “hallucinated” case law to the courts.
The Reddy Appeal
Reddy was an appeal of a contempt order related to 28 undertaking responses. The appellant’s factum did not provide any hyperlinks or copies of cases to support his case law citations. The respondent highlighted this as a potential GenAI issue, noting that seven of the cases cited by the appellant could not be found and likely did not exist.
The appellant’s counsel explained to the ABCA that a third-party contractor had assisted in preparing the appellant’s materials and had given assurances that GenAI had not been used in drafting them. However, counsel conceded that “the contractor’s explanation may not have been true.” Counsel also claimed that he had received the materials from the contractor late and that, as a result, he had insufficient time to properly verify the cited cases.
The respondent sought enhanced costs payable by the appellant’s counsel personally. The respondent argued that the use of GenAI was an abuse of process and that the respondent’s counsel had spent a significant amount of time searching for cases that did not exist. The ABCA said that, given the circumstances, it was considering imposing costs against the appellant’s counsel personally and sought further written submissions.
The Law Society of Alberta’s Guidance on GenAI
In Reddy, the ABCA noted that the use of GenAI in legal practice engages lawyers’ existing obligations pursuant to the Code of Conduct, including the requirement to perform all legal services to the standard of a competent lawyer. The Law Society of Alberta has also published guidelines and practice notes specifically concerning the use of GenAI. Among these are “The Generative AI Playbook” and the October 6, 2023, Notice to the Public and Legal Profession titled Ensuring the Integrity of Court Submissions When Using Large Language Models. The October 2023 Notice emphasizes the importance of keeping a “human in the loop” to verify any materials created by a GenAI tool.
These resources emphasize that parties should exercise caution when using GenAI and should thoroughly review and verify any GenAI-generated materials, given the risks of hallucinations and copyright infringement, as well as risks to privilege and confidentiality. When referring to cases, statutes or commentary, parties should rely exclusively on authoritative sources, such as official court websites, commercial publishers or well-established public services such as CanLII.
In Reddy, the ABCA made clear that lawyers must plan ahead for the time needed to verify authorities generated by a GenAI tool as part of a lawyer’s practice management responsibilities. If a lawyer engages another party to prepare material to be filed with the Court, the lawyer whose name appears on the filed document bears the ultimate responsibility for the material’s contents.
An Evolving Issue
The Reddy decision is one of several recent Canadian decisions about the potential risks of using GenAI in litigation.
In 2024, the British Columbia Supreme Court ordered that costs of an application be personally borne by counsel to one of the litigants as a result of their reliance on “hallucinated” case law (Zhang v. Chen). In May of this year, the Ontario Superior Court of Justice warned lawyers about the potential ethical and legal issues arising from failure to properly review GenAI content (Ko v. Li). Earlier this summer, the Court of King’s Bench of Alberta also warned self-represented litigants about the risk of “hallucinated” case law and emphasized that “it is unacceptable to refer to legal authorities without verifying the accuracy of the information contained therein” (NCR v. KKB).
It remains to be seen what quantum of costs, if any, the ABCA will order the appellant’s counsel in Reddy to pay personally. As the legal industry continues to adapt to GenAI, it is critical for lawyers to be well-informed about the risks and opportunities of using these tools in their own practice.
For more information, please contact the authors or any other member of our Litigation & Dispute Resolution group.