Bill also clarifies difference between de-identified and anonymized information: Chantal Bernier
The federal Liberal government’s new proposed privacy legislation includes enhanced protection of the personal information of minors.
Minister of Innovation, Science and Industry François-Philippe Champagne and Justice Minister and Attorney General David Lametti introduced Bill C-27, the Digital Charter Implementation Act, 2022 on Thursday. The legislation includes three proposed acts: the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act. The bill will toughen Canada’s private sector privacy regime, facilitate the responsible development of AI, and further implement the Liberal government’s Digital Charter, said the announcement from Innovation, Science and Economic Development Canada (ISED).
The Digital Charter Implementation Act follows the proposed privacy legislation of the same name which was introduced in 2020 but never became law. Chantal Bernier, former Interim Privacy Commissioner of Canada and current member of Dentons’ Canadian privacy and cybersecurity practice group, says Bill C-27 shows the government’s responsiveness to the feedback elicited by the previous legislation, Bill C-11.
“In a very balanced fashion, they had to reconcile competing interests and comments,” she says. “And I am very impressed about how they did listen, and they do show a real concern to preserve that balance.”
Bill C-27 would bolster protections for minors by limiting the rights an organization has to collect and use their personal information and would apply a higher standard for the organizations’ handling of that information, said ISED’s announcement.
How Bill C-27 deals with children’s personal information is “notable” and “well done,” says David Fraser, a privacy lawyer and partner at McInnes Cooper in Halifax. It classifies children’s personal information as “sensitive information,” requiring additional protection and diligence.
“It recognizes that children and information about children is a special category and that children have inherent vulnerabilities.”
The bill also gives children autonomy with their personal information. It states: “The rights and recourses provided under this Act may be exercised (a) on behalf of a minor by a parent, guardian or tutor, unless the minor wishes to personally exercise those rights and recourses and is capable of doing so.”
That means minors who are capable of making decisions about the collection, use or disclosure of their personal information, and who understand the consequences and what they are consenting to, are allowed to consent on their own behalf, says Fraser.
“It's more akin to, for example, medical consents, where if you are a young person who is capable of understanding the consequences of your actions and the consequences of a decision, you're able to make that decision,” he says.
A lack of explicit protection for minors was one of the gaps in the current privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), says Wendy Hulton, partner at Dickinson Wright PLLC in Toronto, whose practice includes data privacy and cybersecurity.
But Hulton says Bill C-27 does not define minors. “As somebody who has been to federal court and argued with Health Canada on the definition of a child, I can tell you that that's going to be a real bone of contention.” That issue should be addressed during consultation, she says.
Classifying all information collected from minors as sensitive information means that organizations collecting such information will require express consent, says Bernier. They will also need to provide clear policy notices adapted to the age of the audience and will be required not to retain that information longer than necessary.
“Every rule about protecting privacy applies at a higher level in relation to sensitive information and therefore, categorically to any minor’s information,” she says.
Another area where Bill C-27 is a “great improvement” on C-11 is its clarification of the difference between de-identified and anonymized information, says Bernier. “Now, it's clear. De-identified information is where the information can be relinked to the individual. Anonymized information is where that has been made impossible.”
Bill C-11 conflated the two information types and subjected anonymized information to privacy-law restrictions, which hampered organizations’ ability to innovate, conduct research and analyse trends. Anonymized information, which is not personal information and does not refer to any identifiable individual, now falls outside the Act, and organizations are free to use it, she says.
“That clarification is absolutely key to organizations.”
The proposed Artificial Intelligence and Data Act is intended to enhance Canadians’ trust in the development and deployment of AI. The bill will ensure AI is built and used in a manner which “identifies, assesses and mitigates the risks of harm and bias,” said ISED. Bill C-27 creates an AI and Data Commissioner, who will support the Minister of Innovation, Science and Industry in fulfilling their responsibilities under the Act. And the bill proposes criminal offences for AI developed via unlawful data collection, reckless deployment posing serious harm, or deployment involving fraudulent intent to cause economic loss.
The Artificial Intelligence and Data Act will require impact assessments to determine whether an AI system involves high-risk processing and whether it will negatively impact human rights, says Fraser. If it is deemed a “high-risk activity,” guardrails will need to be erected, and in cases where that activity would harm a person’s physical well-being or property, it will be prohibited, he says.
The impact assessments are “absolutely critical,” says Hulton. “At our firm, I'm the co-chair of our diversity committee, and I can't tell you how much concern I have with some things that I'm reading about these AI programs.”
The Artificial Intelligence and Data Act does not apply to products, services or activities under the direction of the Minister of National Defence, the Director of the Canadian Security Intelligence Service, the Chief of the Communications Security Establishment, or “any other person who is responsible for a federal or provincial department or agency and who is prescribed by regulation.” This means law enforcement and national security agencies are “able to deploy harmful AI as much as they want,” says Fraser.
“That carve out is pretty glaring, in my view,” he says.