Dark Patterns: A New Legal Standard or Just a Catchy Name? (Part One)
State and federal regulators have placed new emphasis on combating so-called “dark patterns” – a term coined in 2010 by user-experience expert Harry Brignull, who runs the website darkpatterns.org. Consider some of the actions of 2021: In April, the FTC hosted a workshop dedicated to dark patterns. In July, Colorado passed the Colorado Privacy Act, which specifically defines and prohibits the use of dark patterns. In October, the FTC issued a policy statement warning against the use of dark patterns in subscription services. And just last week, a bipartisan group of four states sued Google, alleging in part violations of state law for Google’s use of dark patterns in obtaining consumers’ consent to collect geolocation information. But other than a catchy name, is there really anything new about the types of conduct that state and federal officials are calling illegal? This two-part blog post takes a closer look at that question.
What Are “Dark Patterns”?
A number of definitions of “dark patterns” are bandied about. Darkpatterns.org calls them “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.” The Colorado Privacy Act defines dark patterns as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” And in the recent Google lawsuits, each State defined dark patterns as “deceptive design choices that take advantage of behavioral tendencies to manipulate users to make choices for the designer’s benefit and to the user’s detriment.”
In other words, “dark patterns” are practices or formats that manipulate or mislead consumers into taking actions they would not otherwise take, or want to take. In part one of our analysis, we’re going to take a closer look at a couple of recent State Attorney General (AG) multistate actions to see whether “dark patterns” is really a new concept.
Examples from Recent State AG Enforcement
In 2019, the District of Columbia and Nebraska AGs sued Marriott and Hilton, respectively, alleging deception in their charging of “resort fees.” In neither suit will you find the phrase “dark pattern,” but both cases allege that the hotel chains designed the online customer flow to obscure fees, impairing consumers’ ability to comparison shop and ultimately affecting their ability to make an informed choice. While these cases are still pending, the basic deception theory asserted is similar to past AG actions – for example, in the subscription service space, where AGs alleged that sales flows steering consumers into a subscription while failing to prominently disclose the recurring nature of the charge violate their unfair and deceptive trade practice laws.
Last week’s Google lawsuits have a very similar feel. Many of the factual allegations described as “dark patterns” fall cleanly in a traditional deception analysis – for example allegations that Google fails to adequately disclose location collection settings or uses misleading in-product prompts that misrepresent the need for location information or the effect on the functionality of the product. But what about some of the other factual allegations found in the lawsuits, such as Google, “repeatedly ‘nudging’ users to enable Google Account settings” or that Google fails to sufficiently emphasize the advertising and monetary benefits to Google of obtaining location information? Indiana and the District of Columbia both allege that Google is engaging in an unfair practice by “employing user interfaces that make it difficult for consumers to deny Google access to and use of their location information, including making location-related user controls difficult to find and repeatedly prompting users who previously declined or disabled location-related controls to enable those controls.”
But despite the “dark patterns” label, two core components hold true across all the enforcement actions discussed: 1) the conduct was allegedly the result of affirmative, intentional choices in designing the product or service, and 2) there was a necessary impact on consumers, impairing their ability to make an informed choice. In other words, it isn’t just that pop-ups or even multiple notices try to persuade consumers to make a choice. Rather, the pop-ups and notices are designed in a way that impairs the consumer’s ability to voluntarily make that choice.
It remains to be seen whether the States will be successful in the actions discussed here, but one should not assume that their use of the phrase “dark patterns” will create a new standard under the law. Indeed, courts will analyze the facts under the legal standard alleged (deception or unfairness) just as they always have. Nevertheless, companies should take note that States may be putting renewed emphasis on practices and formats that undermine consumer choice, and should seek counsel in designing their purchase flows, cancellation methods, and other consumer communications so they don’t subject themselves to similar allegations.
Stay Tuned for Part Two
In part two, we’ll look at recent FTC enforcement trends and whether “dark patterns” are creating a new standard in the federal arena.
Subscribe to Kelley Drye’s Ad Law Access blog here.