Hallucinations and exposure of proprietary information among generative AI’s workplace legal issues

AI's productivity boost will cause 'shifts and tremors,' says employment lawyer Ed Matei

The use of generative AI in the workplace is raising legal issues due to hallucinations and the input of proprietary and other sensitive information, says Ed Matei, an employment lawyer at Peninsula Canada.

Workers use ChatGPT for writing speeches, memos, and emails. They use it to edit documents, correct grammatical errors, and enhance clarity. They are also using it to generate ideas, conversing with the tool to tap its vast well of information and stimulate creativity. ChatGPT, the chatbot that uses generative AI to respond to user queries, launched last November and reached 100 million monthly users by February, making it the fastest-growing consumer app in history, according to Reuters.

A recent Reuters/Ipsos poll of US workers found that 28 percent of respondents used ChatGPT regularly at work. Only 22 percent said their bosses had explicitly permitted ChatGPT use, while another 10 percent said their bosses had expressly forbidden it.

Generative AI tools such as ChatGPT are going to support, to some degree, any creative industry, says Matei. That is the best-case scenario. He says the worst case is that, rather than supporting workers, these tools will slim down the workforce.

“In comparison to somebody that works eight hours a day, requires benefits, requires time off, requires an HR department – obviously, if you have a piece of software that is capable of putting out the same product or even a comparable product that is marginally inferior, that could be a problem.”

“From an employer’s perspective, the second one can be far more attractive.”

In a Goldman Sachs report from earlier this year, the bank predicted that generative AI could automate 44 percent of legal tasks in the US.

Generative AI is not yet at the point where it is replacing jobs; it is merely a tool that can multiply productive output, says Matei. The legal issues he sees arising in the workplace include hallucinations, where the tool receives a query and invents an answer with no factual foundation.

A prominent example of this flaw is the New York City lawyer who used ChatGPT for legal research and was given six cases that the chatbot had made up. Steven Schwartz’s client had been injured during a commercial flight, and the airline argued that the lawsuit had been filed outside the limitation period. Schwartz used ChatGPT to find cases supporting his position that the suit had been filed on time, but when opposing counsel looked up the citations, they found nothing. Schwartz, who had not double-checked ChatGPT’s work before filing his submission, then had to admit that the cases did not exist.

Another legal issue Matei sees in workplaces is employees entering proprietary information into generative AI-powered tools, which risks exposing sensitive information and trade secrets. That could also create serious problems for the employee responsible, he says.

Workplace ChatGPT use involves consumer protection risks. Businesses must inform consumers that they are interacting with ChatGPT rather than a human representative and must disclose when providing documents that the AI has produced. Failure to do so can result in claims of unfair or deceptive practices, and businesses should prioritize transparency and disclosure when using ChatGPT with customers, according to a news release from Peninsula Canada.

Matei’s work includes crafting and implementing internal policies to help businesses succeed. This includes regulating generative AI use in employment contracts and establishing next steps when issues arise.

Peninsula suggests that employers mitigate ChatGPT’s risks with comprehensive training on its appropriate use and limitations, and by setting clear policies and guidelines, including employment-contract provisions that explicitly prohibit the disclosure of trade secrets and confidential and proprietary information. Peninsula also recommends secure communication protocols, encryption measures, and regular audits and monitoring. The company said employers should have data detection and retention policies and conduct legal and compliance reviews.

“Maybe I'm a bit of a technology optimist,” says Matei. “I think we're at the precipice of having an entirely new, very useful tool to be able to massively increase output… I certainly think that there may be shifts and tremors coming as a consequence. But ultimately, I hope that this is no different of a tool than anything else. Obviously, time will tell.”
