Laws should monitor bias in AI, experts say

Artificial intelligence expert To Anh Tran said that, in the wake of Europe’s privacy laws, other lawmakers must step up to address concerns about bias in artificial intelligence. Tran was one of a panel of experts who spoke at the Law Society of Ontario’s event in Toronto on March 6 to mark International Women’s Day.

To Anh Tran, Helen Kontozopoulos and Valentine Goddard spoke at the Law Society of Ontario’s International Women’s Day event on March 6 in Toronto. Photo: Angelyn Francis/Law Society of Ontario

“Globally, government would have to make some effort — everyone would have to make some effort — to step up and make those types of regulations: norms, values, ethics,” said Tran.

“In the United States, we have [the Food and Drug Administration], which has a drug approval process. We need to do something similar with artificial intelligence, where we know which products we want to be accepted. . . . I’m glad that Europe has already raised the flag, as you know, with GDPR. When it comes to the privacy of the data, most of the application has to be in compliance, and if not, you don’t do the business.”

The panellists were Helen Kontozopoulos, a startup founder and University of Toronto professor, and Tran, a senior technology advisor for the Canadian government. Quebec lawyer Valentine Goddard, founder and CEO of AI Impact Alliance, moderated the panel.

Law has not been immune to the rise of artificial intelligence, as evidenced by Toronto companies such as DealMaker, which offers a cloud platform to help lawyers manage transactions, said Tran.

“That platform allows you to automate an entire deal from a legal perspective. Legal fields are so risk-averse with anything that is outside of their firewall,” said Tran. “Data, signatures, electronic approvals, moving the documentation. . . . As it is today, [lawyers] are probably spending 80 per cent of their time doing administrative tasks.”

But the use of artificial intelligence in law enforcement has raised concerns about its bias, she said.

Tran pointed to a U.S. case, Wisconsin v. Loomis, which arose after a judge used a risk-assessment tool called COMPAS in deciding the defendant’s sentence. In that case, the defendant claimed that the use of COMPAS violated his right to due process, in part because “it improperly uses gendered assessments in sentencing,” but the state’s Supreme Court disagreed.

“When you check the data and you analyze on a certain population that is not the entire population, certainly, the bias is there,” said Tran.
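Tran’s point can be illustrated with a minimal synthetic sketch (the data, groups and numbers below are invented for illustration and have no connection to COMPAS or any real risk tool): a risk model fit on one group, then applied to a group whose data it never saw, systematically over-flags that second group.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Group A: the population the model is trained on.
x_a = rng.normal(0.0, 1.0, n)
y_a = x_a + rng.normal(0.0, 1.0, n) > 0.0   # true risk tracks the feature

# Group B: the same true risk relationship, but the feature is shifted,
# e.g. a score that reads systematically higher for this group.
x_b = rng.normal(0.8, 1.0, n)
y_b = (x_b - 0.8) + rng.normal(0.0, 1.0, n) > 0.0

# Fit on group A only -- "a certain population that is not the entire population."
model = LogisticRegression().fit(x_a.reshape(-1, 1), y_a)

print("flagged high-risk, group A:", model.predict(x_a.reshape(-1, 1)).mean())
print("flagged high-risk, group B:", model.predict(x_b.reshape(-1, 1)).mean())
print("actual high-risk rate, group B:", y_b.mean())
# Group B is flagged far more often than group A even though its actual
# risk rate is the same -- the bias comes from the training sample.
```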

Goddard said the Supreme Court of Canada last year also raised the issue of bias in technology in Ewert v. Canada, 2018 SCC 30. That case explored a series of tools used by Correctional Service Canada to assess the risk of recidivism. Appellant Jeffrey Ewert claimed that the set of tools was “developed and tested on predominantly non-Indigenous populations and that there was no research confirming that they were valid when applied to Indigenous persons,” the SCC’s split decision said.

Kontozopoulos said that if an algorithm is based on decisions that have been made in the past — for example, hiring decisions in a field that has been historically male dominated — there is a risk that the algorithm will perpetuate those trends.
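The mechanism Kontozopoulos describes can be sketched in a few lines. The hiring records, feature names and coefficients below are entirely hypothetical, invented only to show how a model trained on skewed past decisions reproduces the skew:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10000

# Synthetic "historical hiring" records: skill is what should matter,
# but past decisions also favoured gender == 1 for non-merit reasons.
gender = rng.integers(0, 2, n)
skill = rng.normal(0.0, 1.0, n)
hired = skill + 1.5 * gender + rng.normal(0.0, 1.0, n) > 1.0

X = np.column_stack([skill, gender])
model = LogisticRegression().fit(X, hired)

# Two candidates identical in skill, differing only in gender:
candidates = np.array([[0.5, 0.0], [0.5, 1.0]])
print(model.predict_proba(candidates)[:, 1])
# The model assigns a much higher hire probability to gender == 1,
# because it learned the historical preference as if it were a merit signal.
```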

Goddard also noted that lawyers may become involved in the process through the drafting of contracts that are designed to protect the intellectual property of the underlying technology but that also create a “black box,” making it difficult to audit the algorithms to ensure fairness.

“This is impacting our lives — how we buy houses, how we buy insurance,” said Kontozopoulos. “We are putting people into these boxes, and you might get a more favourable insurance [rate] or discount than [the person] who is sitting next to you. . . . I’m training something on the information of the past to try and make things faster and easier.”