Data security and bias among primary concerns with AI in immigration law: Sergio Karas

Immigration, Refugees and Citizenship Canada has expanded AI use in face of application backlog


At the beginning of the pandemic, the federal government amended the Immigration and Refugee Protection Act to authorize the use of electronic and automated tools by immigration officers.

Immigration, Refugees and Citizenship Canada (IRCC) uses artificial intelligence (AI) to help process applications, as the tools can quickly spot inconsistencies and issues. Immigration authorities also use AI to assist with decision-making on visa applications by analyzing an applicant’s personal information, including their work history and education.

IRCC began dabbling in AI before the pandemic, but COVID has heightened its need because application volumes have “increased dramatically,” says Sergio Karas, a certified specialist in Canadian citizenship and immigration law, who practises in Toronto.

As the use of the technology expands and develops in the immigration system, Karas says there are clear benefits and serious concerns.

On the positive side, AI and machine-learning use will expedite processing, he says. The tools speed up the work by more quickly digesting simple applications. Where a document is missing, AI can quickly identify which one and flag it to the decision-maker. AI can also enhance the system’s integrity by detecting fraudulent applications and potential security threats. By analyzing social media, AI can spot “red flags,” including evidence of terrorism or other criminal activity.

AI chatbots can also help new Canadians integrate and successfully settle in Canada by providing information about the job market. They can also enhance the customer experience for those in the process of immigrating by answering common questions, eliminating the need to wait on hold while an IRCC call centre representative finds the answers.

The downside is that while AI tools are great with simple applications, they are not as good with complex ones, says Karas.

“We're not at the stage where AI can nuance the responses. A lot of the applications require in-depth analysis.”

The concerns also centre around how those algorithms are trained. Tools trained on biased data will produce biased results.

“Garbage in, garbage out. We don't know what kind of parameters these algorithms are being fed,” says Karas. “That can lead to discriminatory practices.”

This concern extends to AI tools used on the other end of the system to assist people in developing their applications. He says this could “very easily” lead to a situation where the tool is counselling misrepresentation.

Another potential problem, says Karas, concerns transparency and accountability. If people do not know how or why AI systems are arriving at certain decisions, it makes it more difficult to challenge those decisions.

As with any data-driven process, privacy is another concern. Just as the technology can spot red flags in an applicant’s social media, it can gather sensitive information about any applicant. If this data is insecure, cybercriminals could use it for malicious purposes. The capabilities of these tools are also effective for mass surveillance, he says.

Protecting data must be “paramount,” says Karas. Canada must “implement clear and effective regulations to ensure that personal data is collected, stored, and used responsibly and ethically” and “that individuals have control over it,” and the data should be deleted after a certain number of years, he says.

With the introduction and growing sophistication of ChatGPT, he also wonders what automated chatbots will mean for the system.

“I worry that ChatGPT may be writing decisions, for example, on applications. Or are we going to have a robot visa officer and a robot judicial review judge? It's interesting. It opens up Pandora's box.”

Despite the concerns, Karas expects that the use of AI in Canadian immigration will continue to grow.

“I am hoping that the Office of the Privacy Commissioner is on top of all these things because we are going down a rabbit hole here, and we don't know where it leads,” he says.