Civil society groups call on feds to protect privacy, free expression in imminent online harms bill

Ottawa plans to table legislation as soon as possible, says Heritage Canada spokesperson


As Ottawa says it hopes to table online-safety legislation as soon as possible, 13 civil society organizations, including the Canadian Civil Liberties Association (CCLA), have released a joint statement calling on the government to ensure the proposal protects privacy and freedom of expression. 

In their letter, the CCLA, Arab Canadian Lawyers Association, Canadian Internet Policy and Public Interest Clinic and ten other groups working on internet policy, civil liberties, and human rights, warned Minister of Canadian Heritage Pablo Rodriguez to ensure his proposal does not allow for proactive monitoring of online content, infiltrating private encrypted messages, or blocking websites without judicial authorization. The groups also told the minister that the bill should not require “mandatory takedown windows for most illegal content,” nor implement new definitions of “targeted harmful content,” outside what is already defined in Canadian law.

The imminent online-safety legislation follows public consultations and input from an expert advisory group that informed the government's approach.

“The Government’s objective is to promote a safe and inclusive online environment for all Canadians,” says David Larose, media relations at Canadian Heritage. “The Government of Canada wants to take the time needed to get this right and draw upon the valuable feedback received throughout its engagement in 2022. It will help inform a legislative and regulatory response that supports an inclusive, free and safe online space for all Canadians.”

“The Government hopes to table legislation as soon as possible,” he says.

In its initial proposal, the federal government focused on five types of online harm: child sexual exploitation, revenge porn, hate speech, terrorism propaganda, and incitement to violence. Some of these are straightforward, while hate speech, for example, can be highly contextual, says Cara Zwibel, director of the Fundamental Freedoms Program at the CCLA. The CCLA and other organizations were concerned that Ottawa would take the same approach as other jurisdictions and impose short timelines within which platforms must take steps to remove content flagged by users. This could incentivize platforms to “err on the side of removal” and lean too heavily on taking down content whenever there is a complaint, she says.

Following consultations and the expert advisory group, Zwibel says Ottawa appears now to be leaning toward a “risk-based approach,” rather than concentrating on content removal.

“Instead of implementing a bunch of takedown requirements, they'd be focused on asking the platform to assess and evaluate the risks associated with their platform and the business they run and explain how they're going to mitigate some of those risks and what steps they're going to take.”

In the letter, the signatories list seven actions the legislation must not allow because they would pose a serious threat to freedom of expression and privacy. These include not requiring platforms to issue reports to police or national security agencies, aside from instances of child exploitation or where there is a risk of imminent violence. The groups say the legislation must not permit the government to compel platforms to collect, intercept, or share private communications without a court order; must not authorize website blocking without a court order or appeal rights; and must not mandate “short and inflexible” timeframes for taking down content, with exceptions for content related to child exploitation, content representing a risk of imminent harm to a person, and revenge porn.

The groups want the legislation to require transparency from platforms in how they moderate content, to require that platforms maintain an appeal process for removed content, and to “encourage algorithmic transparency for the purposes of research and investigation.” They also want the legislation to encourage platforms to develop tools empowering users to easily block others or lock down their accounts, and they say conduct standards should focus on “patterns of behaviour based on reasonable risk assessments” rather than “a standard of perfection where any mistake is subject to a penalty.”

Larose notes that some of the civil-society organizations who signed the joint letter participated in the consultation process. Heritage Canada has published a report on the process.
