Regulators increasingly using privacy law to confront deceptive online design, say lawyers

E-commerce growth has coincided with a rise in manipulative user interface design techniques


As deceptive approaches become more prevalent online, regulators are increasingly using privacy laws to target their use, say two lawyers from Dentons.

Website developers use “dark patterns,” which are manipulative and deceptive user interface design components, to coax users into unintentionally clicking on or agreeing to things that they would not have freely chosen, or that are against their best interests. This often involves tricking the user into agreeing to share more personal information than they otherwise would.

Dark patterns include default settings which maximize the collection of personal information, disguised ads, hidden costs, additional items automatically added to a user’s e-commerce cart that the user must uncheck before checking out, and “roach motels,” which are services that make it easy to sign up but nearly impossible to cancel or unsubscribe.

“We're definitely seeing dark patterns increasingly being targeted through privacy laws,” says Danielle Dudelzak, an associate in Dentons’ corporate group, with a focus on the technology, media, and communications sectors. While dark patterns have been around for over a decade, they are attracting growing attention from regulators and industry bodies because they are becoming more sophisticated and widespread, she says.

Dark patterns are an underhanded use of what is often called “choice architecture,” which means presenting or organizing choices in a way that influences decision-making.

“Choice architecture is a fancy way of saying humans are fundamentally lazy, don't like reading things, and will take the path of least resistance – or we’re easily confused,” says Kirsten Thompson, partner and the national lead of Dentons’ privacy and cybersecurity group.

Regulators in the European Union and the US are beginning to crack down, and Canada is following suit, with Quebec’s new privacy law coming the closest to explicitly targeting these behaviours, she says.

Under Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA), it is prohibited to obtain consent through deception. Last year’s proposed privacy law reforms in Bill C-27, the Digital Charter Implementation Act, 2022, go further, says Dudelzak.

“It says that an organization can’t obtain or attempt to obtain consent by providing false or misleading information or using deceptive or misleading practices,” she says. “So, this is more of a direct nod to addressing dark patterns. It doesn't go so far as some of the US privacy legislation that actually attempts to define what dark patterns are in the legislation.

“But we are moving in that direction where we are starting to kind of get more express about what we mean by dark patterns.”

The proposed legislation states that obtaining consent through deceptive or misleading practices invalidates that consent. As Canada modernizes its privacy laws, it will more closely resemble the EU’s General Data Protection Regulation (GDPR), says Dudelzak.

Under the GDPR consent must be “freely given, specific, informed, and unambiguous.”

“Within the concept of consent are built in these very specific types of requirements,” says Thompson. “Dark patterns and other deceptive or misleading mechanisms are unlikely to pass scrutiny given the granularity of that requirement.”

Canada’s privacy legislation is more flexible, which was deliberate for the sake of neutrality and the ability to adapt to constantly evolving technologies, she says. Under PIPEDA, consent must be “meaningful.”

“Fundamentally, it's a little too flexible, I think, given the current digital developments.”

Quebec is attempting to address some of these issues in its new privacy law, says Thompson. The province passed Bill 64, An Act to modernize legislative provisions regarding the protection of personal information, in late 2021.

In Quebec’s legislation, “privacy by design” is the default, she says. Users must enable data-collecting mechanisms themselves; otherwise, they remain off.

“It's a little unclear in the current drafting what that means exactly. But that's what we're starting to see. The consent model under the Quebec legislation also looks a lot more like what we see in the EU.”

In addition to privacy law, dark patterns also have implications for competition and consumer protection law. In the US, they have been getting a lot of consumer-protection attention from the Federal Trade Commission, says Dudelzak.

Consumer-protection laws are a “hot button issue” for class actions, adds Thompson.

“If you run afoul of consumer protection law – particularly in Quebec because it's quite stringent consumer protection law – that can land you in hot water… People don't like feeling like they've been deceived, or that they've been misled. So, there tends to be a lot of enthusiasm for pursuing legal remedies.”

The Competition Act also prohibits misleading and deceptive advertising, and the Competition Bureau has “sharp teeth, big fines, and significant powers,” says Thompson.

“The problem with this area, choice architecture, is it can be used for both good and evil. It's very difficult to regulate things that can be used in two ways, because where do you cross the line between good and evil?”

It is good when it streamlines the user experience and helps consumers move through a website and navigate pages, but evil when that involves deception and leads the consumer somewhere they did not want to go, she says.

“Regulators should avoid a prescriptive approach that attempts to define what it means,” says Thompson. “You're locked in time, and by the time you actually get a bill through the legislature, the technology has already evolved. Canada takes a good approach on the flexibility piece of it.”

But because that flexibility makes it difficult for businesses to understand what’s expected, there needs to be clear guidance about what is and is not permitted, she adds.

“My expectation would be that where the use of choice architecture tips over into misrepresentation [and] deception, we already have mechanisms to enforce that – some of which have good enforcement powers, others need to be buttressed.”