
In NCR v KKB, 2025 ABKB 417, the Alberta Court of King’s Bench addressed not only a substantive family law appeal, but also a growing concern in Canadian litigation: the misuse of artificial intelligence tools for legal research.

The appellant, a self-represented mother, was partially successful in her appeal. However, when the court reviewed her written submissions, it found that six of the seven court decisions she cited were inaccurate. Some citations appeared genuine, but the party names did not match the actual reported cases, and the propositions attributed to them did not appear in the decisions themselves. One case appeared to be entirely fictitious. The court requested copies of the cases, but the appellant did not respond.

Although the judge found no intention to mislead and acknowledged the appellant’s limited legal knowledge, the court emphasized that citing legal authorities without verifying their accuracy is unacceptable. As noted in Zhang v Chen, 2024 BCSC 285, and quoted in the decision:

“Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to the miscarriage of justice.”

The court also noted the time and resources wasted in attempting to verify non-existent cases and confirmed that it had disregarded the appellant’s cited authorities in reaching its decision.


The Role of AI and the Problem of Fabricated Citations

The court inferred that the appellant may have relied on AI-generated content or internet searches to support her legal arguments. This is becoming a recurring problem in courtrooms across Canada and internationally. While generative AI tools such as ChatGPT can produce information that appears well-structured and authoritative, they are known to generate fictional case law with convincing formatting and fabricated quotes. These hallucinated citations can easily be mistaken for genuine legal sources.

In practice, a citation produced by AI might read “Smith v Jones, 2012 ABKB 224”. The format matches that of real reported decisions. But when checked in a trusted legal database, such as CanLII.org, the names of the parties may not match the citation, and the quoted material may be entirely fabricated. Even experienced legal professionals have encountered this issue when AI-generated results are used without careful cross-checking.
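The point that a well-formed citation proves nothing about a case's existence can be illustrated with a minimal sketch. The short Python snippet below (the pattern and function names are illustrative, not part of any real verification tool) checks only whether text matches the shape of a Canadian neutral citation; a fabricated reference passes this test just as easily as a genuine one, which is exactly why a lookup in a trusted database remains essential.

```python
import re

# Shape of a Canadian neutral citation: year, court identifier (ABKB, BCSC,
# SCC, etc.), then a decision number.
NEUTRAL_CITATION = re.compile(r"\b(?P<year>\d{4})\s+(?P<court>[A-Z]{2,6})\s+(?P<number>\d+)\b")

def looks_like_citation(text: str) -> bool:
    """Return True if the text contains something shaped like a neutral citation.

    This checks FORMAT only. A hallucinated citation such as
    'Smith v Jones, 2012 ABKB 224' passes just as easily as a real one;
    only a search of a trusted database (e.g. CanLII) can confirm the
    case exists and actually says what is claimed.
    """
    return NEUTRAL_CITATION.search(text) is not None

print(looks_like_citation("Smith v Jones, 2012 ABKB 224"))  # True: well-formed, possibly fictitious
print(looks_like_citation("Zhang v Chen, 2024 BCSC 285"))   # True: well-formed, and a real decision
print(looks_like_citation("some unverified AI output"))     # False: no citation pattern at all
```

Both the first and second examples pass the format check, even though only one of them may correspond to a real decision: formatting is no substitute for verification.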

Consequences for the Litigant

Despite her partial success, the appellant was denied costs. Under Alberta’s Rule 10.31(5), self-represented litigants may be awarded costs in certain cases. However, the court held that awarding costs in this situation would not serve the purposes of cost awards, such as encouraging efficiency, penalizing improper conduct, or promoting settlement. Each party was therefore ordered to bear their own costs.


A Judicial Warning for the AI Era

NCR v KKB highlights the emerging challenges posed by the intersection of legal process and artificial intelligence. The court’s response makes clear that accuracy and reliability remain essential in legal submissions, regardless of the tools used to prepare them. While technology may offer new pathways to legal information, it cannot replace the need for careful research and proper verification.

This case serves as a warning to all court participants: even unintended errors can have real consequences when fictional case law enters the courtroom record.