GenAI exposes limits of Canadian copyright law, say IP lawyers

Lawyers warn that without legal reform, compliance relies on assumptions and foreign frameworks

Jean-Sébastien Dupont, Pina D’Agostino, Vincent Bergeron

Generative AI is disrupting Canada’s creative industries and exposing how unprepared copyright law is for the infringement disputes it generates. Legal experts warn that unresolved questions about authorship and fair dealing in this new context create a minefield for artists and businesses.

At the heart of the issue is a growing divide between the rapid evolution of genAI tools and the slow pace of Canadian copyright reform. These tools are already used to generate everything from marketing content to deepfake scams, often trained on copyrighted material without the original creators’ consent. And with no codified rules to determine ownership of AI-generated works, legal professionals say the risks are only growing.

GenAI and the limits of expression

Existing copyright law does protect creative industries, but grey areas abound.

"What you can protect is the expression of an idea, not the idea itself," says Jean-Sébastien Dupont, an IP litigator at Smart & Biggar LLP. He offers a familiar scenario: anyone can write a love story between people from different social classes. What they can’t do is copy Romeo and Juliet, characters and all.

"The question with genAI is whether its outputs cross that line," Dupont explains.

That line between inspiration and infringement is getting harder to draw. GenAI tools, trained on enormous datasets, including copyrighted works, can now replicate recognizable styles with uncanny precision. According to Dupont, certain tools that mimic a specific artistic language likely come dangerously close to copying an original expression, not just riffing on a genre or idea.

Still, legal guidance in Canada remains limited. "Copyright is entirely statutory here," Dupont says. "Judges interpret the law as it's written; they don’t create it. So, we're operating in a legal grey zone until Parliament steps in or courts issue clear rulings."

Companies are creating their own rules – for now

While lawmakers hesitate, businesses aren’t waiting. A late 2024 KPMG survey of Canadian organizations found that 61 percent had already adopted generative AI, and nearly 9 in 10 considered it critical to maintaining a competitive edge.

"Organizations are drafting their own rules of engagement," says Professor Giuseppina D’Agostino, an intellectual property scholar at Osgoode Hall Law School. These policies, while helpful, are fundamentally stopgap measures.

Vincent Bergeron, leader of ROBIC’s emerging technologies group, believes the issue isn’t just legal; it’s geopolitical. “You just don’t know where it’s going and what to actually do,” he says, referring to the rapidly shifting AI regulatory landscape.

Over the past several months, Canadian companies have faced heightened uncertainty as US executive orders flip with changing administrations, European AI regulation advances, and domestic efforts like Bill C-27 stall.

Bergeron notes that this isn’t the first time companies have looked abroad for direction. “We saw this with [the] General Data Protection Regulation,” he says. “Most companies will try to find what the gold standard [is for] big economies and [aim to meet that] bar, even [if] it does not apply locally.”

Yet while Canadian businesses look abroad for legal clarity, they also grapple with growing public skepticism. “In creative industries, just admitting you used genAI can trigger a public backlash,” Bergeron explains. Even when genAI is used solely for early concept development, some studios downplay or omit its role, fearing reputational fallout from online scrutiny.

Still, as all three experts note, genAI isn’t something companies can afford to ignore; what matters is how they use it. Businesses must understand where the legal lines are, how to manage the risks, and when to exercise caution. The danger isn’t using the tools — it’s using them blindly, without understanding the legal and reputational exposure they carry.

Deepfakes and the risk of digital misrepresentation

The risks for these businesses escalate when generative tools are no longer used purely for creative output but are instead repurposed for deception. CBC uncovered a surge in fraudulent online ads impersonating CBC News and prominent Canadian politicians to promote cryptocurrency scams. These ads falsely claim government endorsement to lure unsuspecting individuals.

Canadian law offers limited recourse for victims of this kind of digital impersonation. While common law protections against defamation and misrepresentation exist, no single statute governs deepfakes or the unauthorized commercial use of someone’s likeness.

"It’s a copyright question, but [there’s also an issue of] defamation," says D’Agostino. There are “various infringements taking place, which go beyond intellectual property. We need sui generis legislation to address these new cases specifically.”

The United States, for example, grants individuals personality rights. Canada, by contrast, offers limited statutory protection, relying instead on provincial law – most notably in Quebec, where the civil law provides more defined standards around consent and the use of a person’s likeness.

"We have a robust common law framework as well as precedents on this," D’Agostino explains. "But we need to put them into a legislative format to provide clarity in this new crypto-AI era."

Legal lag and the role of legislatures

Legislative reform in the copyright world is long overdue. Courts interpret the law, but they don’t write it. All three experts agree that the federal government needs to regulate generative AI.

"We’re in a grey zone," D’Agostino says. "I think it's helpful to have some bright lines in place. It's a matter of this new government showing some leadership again."

Bergeron shares the same sense of urgency. “The Copyright Act says it must be reviewed every five years. That hasn't happened in a meaningful way since 2012,” he says.

Without reform, creators and businesses are left debating whether using massive datasets to train genAI models qualifies as fair dealing, a question that remains unresolved. Bergeron says the ambiguity could fuel calls for royalties, echoing music industry norms.

Dupont agrees that the law is lagging but suggests that the more profound challenge is cultural. “There will always be someone who disagrees with the court’s ruling, whether it favours AI companies or creators,” he says. “The bigger question is: do we, as a society, agree with that outcome?”

With the Canadian federal election underway, D’Agostino says it’s time for the federal government to act. “We need to think about our AI policies. It should be among the very first things a new government tackles.”