Lawyers react to proposed regulatory framework for 'harmful online content'

Proposals raise issues concerning freedom of expression, unreasonable search and seizure: lawyers


Heritage Canada has released a proposed regulatory framework establishing rules for how social media platforms and other online services must deal with “harmful online content,” such as child pornography, terrorist content, hate speech and the non-consensual sharing of intimate images.

The plan poses problems for Charter protections such as freedom of expression and protection against unreasonable search and seizure, say two lawyers who spoke with Canadian Lawyer. Another says that, while “the devil is in the details” and more will be revealed when a bill is drafted, the underlying purpose of providing victims of online harms a more practical recourse is welcome and necessary.

In its prelude to a future bill, Heritage Canada proposes new rules and obligations for online platforms, modifications to Canada’s existing laws and the establishment of three new regulatory bodies: a Digital Safety Commissioner, a Digital Recourse Council of Canada and a Digital Safety Commission.

“My first question is why is this proposal being presented by the Minister of Heritage?” says David Fraser, an internet, technology and privacy lawyer at McInnes Cooper in Halifax. “If the objective of the bill is to be able to get off the internet harmful content… I think this could be accomplished with a simple amendment to the Criminal Code, and a process for getting a court order, rather than this whole bureaucracy in Ottawa.”

“We think it's a solution in search of a problem,” says Joanna Baron, executive director of the Canadian Constitution Foundation (CCF). “We think that there are already… Criminal Code provisions against this speech. And so, I'm not sure what the justification is for building a regulatory superstructure with enhanced police powers.”

A significant concern for Fraser and Baron is the proposed 24-hour takedown requirement for harmful content.

Under “New rules and obligations,” Heritage Canada suggests the Act should require online platforms to address content flagged as harmful within 24 hours of receiving a complaint. Within that window, the platform must respond to the complainant, stating either that the content does not meet the definition of harmful content, or that it does, in which case the platform must make the content inaccessible in Canada.

The quick turnaround works for content that is clearly illegal, but anyone familiar with content-moderation decision-making on internet platforms knows there is “a whole lot of grey area,” says Fraser. Complicating the matter further is that algorithms, as well as people, will be doing the moderating.

“Can you determine whether or not content that relates to terrorism is likely to incite terrorism, or whether it's just about terrorism?” says Fraser.

Faced with significant fines, platforms will err on the side of caution and take down content simply because they received a complaint, he says.

“I think that is probably the most significant new rule in terms of freedom of expression,” says Baron.

Maanit Zemel, a civil and commercial litigator whose practice focuses on Canada’s Anti-Spam Legislation, online defamation, privacy and data security, says it is positive that the government is establishing “practical recourse” for victims of online harms, because “currently, victims of these harms have no recourse.”

“They can go to the police, but law enforcement has very limited resources and incentives, in a lot of ways. They don't spend a lot of time or effort or money or know-how in prosecuting these offences,” she says. “And when they do, it's long, it's arduous and the victims themselves don't really see any practical results because the content continues to exist online.”

The content remains online because platforms currently have no incentive to remove it, and it is entirely up to them whether to do so, says Zemel. If platforms opt not to remove the content, complainants will be able to file complaints with the regulatory bodies established under the Act.

“It is intended to provide a relatively efficient, quick, and, I guess, discreet process for victims to deal with these issues. Because currently, they don't have that.”

Under the rules for the Digital Recourse Council, the proposals state that the Act should allow the Digital Safety Commissioner to conduct in-camera hearings, “where a public hearing would not be in the public interest, including where there is a privacy interest, national security interest, international relations interest, national defence interest, or confidential commercial interest.”

“These are pretty broad categories,” says Baron. “So, there's a lot of scope for the government to demand a private hearing, if there's some dispute over whether a platform is to take down disputed content.”

“That's a violation of the general principle of open courts: that it's only in exceptional circumstances that court procedures should be held in secret. But anything touching on privacy, security, international relations – that would seem to go beyond ‘exceptional circumstances.’”

Baron also takes issue with proposed “Inspection Powers” that would allow an inspector to enter “any place” – other than a “dwelling house,” which requires consent or a warrant – if there are reasonable grounds to believe the site contains material “relevant to the purpose of verifying compliance and preventing non-compliance with the Act, regulations, decisions and orders.” Once inside, the inspector can access any computer system and examine any data it contains. These powers raise an issue under s. 8 of the Charter, says Baron, who, prior to joining the CCF, was a criminal defence lawyer with Greenspan Partners.

“The Supreme Court of Canada has been very clear that personal computers attract a very high level of privacy because they're seen as containing a lot of information,” she says.

“Even employees at a company have a right to privacy on computers that are company property.”

Heritage Canada is also proposing that the legislation give the Digital Safety Commissioner the authority to apply to the Federal Court for an order blocking access to an offending website where the site is repeatedly non-compliant, hosts child pornography or terrorist content, and all other enforcement measures have been exhausted.

“That's the first time that I'm aware of a potential piece of legislation that mandates a site-blocking order,” says Zemel. “… This would certainly be very much precedent-setting, not just nationally but internationally, in a western country.”

“The possibility of blocking websites – like full websites, not just a link to a particular video or a link to a particular article – is a pretty dramatic step,” says Fraser.

Heritage Canada is seeking written feedback on the proposals.