Facial-recognition tech deployed on Canadian streets would mean ‘annihilation of anonymity:’ CCLA
The RCMP violated the Privacy Act when it acquired and used Clearview AI facial recognition technology, according to a special report to Parliament by Canada’s Privacy Commissioner.
Clearview allowed the RCMP to match photos of Canadians with the more than three billion images the company had scraped from the internet without user consent, said the Office of the Privacy Commissioner of Canada (OPC). Clearview’s online extraction represents a “clear violation of the Personal Information Protection and Electronic Documents Act (PIPEDA).” Collecting this personal information, which Clearview obtained unlawfully, put the RCMP in contravention of s. 4 of the Privacy Act, said the OPC.
“Police aren't allowed to use surveillance tools, if the information that feeds, creates or forms the basis for that tool was unlawfully acquired,” says Dr. Brenda McPhail, director of the Privacy, Technology and Surveillance Program at the Canadian Civil Liberties Association (CCLA).
Otherwise, “It would mean that those bodies, that are duty bound to enforce our laws, would have a loophole that would allow them to break them, sort of at a whim,” she says.
The report alleges the RCMP repeatedly misled the OPC about Clearview AI, first denying its use of the technology, then claiming it was limited to identifying, locating and rescuing children who have been victims of online sexual abuse. The RCMP told the OPC it had made 78 search queries with Clearview, but the company’s records showed it had made 521. While the RCMP claimed the gap amounted to multiple searches of the same individual, the OPC said the records showed that explanation could not account for 85 per cent of the RCMP’s searches. Only approximately six per cent of the searches appeared to relate to identifying child victims of online sexual abuse.
In its response, the RCMP said the OPC’s report and recommendations would guide its adoption of new and emerging technologies.
“We acknowledge that there is always room for improvement, and we continually seek opportunities to strengthen our policies, procedures and training,” said the RCMP’s statement. “The RCMP has accepted all of the recommendations of the OPC and has already begun efforts towards their implementation.”
“The RCMP continues to agree that further engagement is needed pertaining to, not just facial recognition, but all biometric analysis that could be used to support criminal investigations. Technologies will continue to evolve rapidly, and with the prevalence of digital media, automated searching and comparison tools are likely to become increasingly useful and available to law enforcement agencies.”
Though the OPC is “encouraged” by the RCMP’s response and commitment to implementing its recommendations, it expressed concern that the law enforcement agency disagreed with the finding that it violated the Privacy Act. The RCMP takes the position that it is not obliged to ensure third parties from which it collects personal information have acted lawfully with respect to that information, said the OPC.
If the RCMP’s perspective prevailed, a “whole range of potentially malicious actors” would be incentivized to acquire information, legally or illegally, to develop products to sell to police, says McPhail, who holds a PhD from the University of Toronto, where she studied the social impacts of technology.
“It's troubling that they disputed the interpretation of the Privacy Act,” she says. “… And it's, of course, troubling that you've got Canada's national police body making an internal interpretation of the law and then acquiring very invasive technologies on the basis of that interpretation, without adequate review or oversight.”
On top of the OPC’s recommendations, McPhail says she would have liked to see an additional transparency requirement to assure the public that the RCMP will follow through on the recommendations. Given that the report shows the RCMP had made commitments it failed to execute in a timely manner, subsequent oversight or review by the OPC is important, she says.
The investigation and report illustrate what the CCLA and other civil society groups have long said about the need to keep the laws up to date with the technology, says McPhail.
“Our privacy laws were written at a time when facial recognition technology didn't exist. When you were talking about a file, you meant something that you put a piece of paper into and stored in a cabinet.”
“Even with a technologically neutral law, at some point, you need to update the laws to account for these new affordances and new capacities of evolving technologies,” she says.
Especially with biometric technologies, which deal with sensitive personal information, the public needs to think about how best to regulate these tools so Canadian and democratic values are upheld, says McPhail. But first, the public should evaluate whether these technologies should even be used at all, she says.