State and federal regulators have placed new emphasis on tackling so-called “dark patterns,” a term coined in 2010 by user experience expert Harry Brignull, who runs the darkpatterns.org website. Consider some of the actions of 2021: In April, the FTC held a workshop dedicated to dark patterns. In July, Colorado passed the Colorado Privacy Act, which specifically defines and prohibits the use of dark patterns. In October, the FTC issued a policy statement warning against the use of dark patterns in subscription services. And just last week, a bipartisan group of four states sued Google, alleging violations of state law based in part on Google’s use of dark patterns to obtain consumer consent to collect geolocation information. But other than a catchy name, is there really anything new about the kinds of conduct that state and federal authorities call illegal? This two-part blog post takes a closer look at that question.
What are “Dark Patterns?”
There are a number of definitions of “dark patterns” floating around. Darkpatterns.org calls them “tricks used in websites and apps that make you do things you didn’t mean to do, like buy or sign up for something.” The Colorado Privacy Act defines dark patterns as “a user interface designed or manipulated with the substantial effect of subverting or impairing the user’s autonomy, decision-making, or choice.” And in the recent Google lawsuits, each state defined dark patterns as “deceptive design choices that take advantage of behavioral tendencies to manipulate users into making choices to the benefit of the designer and the detriment of the user.”
In other words, “dark patterns” are practices or formats that manipulate or mislead consumers into taking actions they otherwise would not take or want to take. In the first part of our analysis, we’ll take a closer look at some recent multistate actions by state Attorneys General (AGs) to see whether “dark patterns” are really a new concept.
Examples of recent State AG enforcement
In 2019, the District of Columbia and Nebraska AGs sued Marriott and Hilton, respectively, alleging deception in their billing of “resort fees.” In neither lawsuit will you find the phrase “dark patterns,” but both cases allege that the hotel chains engineered their online customer flows to hide fees, impairing consumers’ ability to compare prices and ultimately affecting their ability to make an informed choice. Although these cases are still pending, the basic deception theory asserted is similar to past AG actions, for example in the area of subscription services, where AGs alleged that sales flows directing consumers to a subscription while failing to clearly disclose the recurring nature of the charge violated their unfair and deceptive trade practices laws.
Last week’s Google lawsuits have a very similar feel. Many of the factual claims described as “dark patterns” fall squarely within a traditional deception analysis, for example, claims that Google does not properly disclose its location-gathering settings, or uses misleading in-product prompts that misrepresent the need for location information or its effect on product functionality. But what about some of the other factual claims in the lawsuits, such as Google “repeatedly pushing users to turn on Google Account settings,” or Google failing to sufficiently emphasize the advertising and monetary benefits it gains from obtaining location information? Indiana and the District of Columbia both allege that Google engages in an unfair practice by “using user interfaces that prevent consumers from denying Google access to and use of their location information, including making location-related user controls hard to find and repeatedly prompting users who had previously declined or disabled location-related controls to enable those controls.”
Whether or not this is a dark pattern, two essential elements of these allegations hold true across all of the enforcement actions discussed: 1) the conduct was allegedly the result of affirmative, intentional behavior in the design of the product or service, and 2) there was a requisite impact on consumers, compromising their ability to make an informed choice. In other words, it’s not just that pop-ups or even multiple notices try to persuade consumers to make a choice. Rather, the pop-ups and notices are designed in a way that impairs the consumer’s ability to make that choice voluntarily.
It remains to be seen whether the states will succeed in the actions discussed here, but one should not assume that their use of the phrase “dark patterns” will create a new normal under the law. Indeed, the courts will analyze the facts under the alleged legal standard (deception or unfairness) as they always have. Nonetheless, businesses should note that states are placing renewed emphasis on practices and formats that impede consumer choice, and should be sure to seek guidance in designing their purchase flows, cancellation methods, and other communications with consumers so as to avoid similar allegations.
Stay tuned for part two
In Part 2, we’ll examine recent trends in FTC enforcement and whether or not “dark patterns” are creating a new normal in the federal arena.