Dark patterns — a new frontier in privacy regulation


July 29, 2021 - Dark patterns have been around for quite some time, but as society grows increasingly digital, these manipulative practices have become more complex and pervasive. Recently, the growing use of dark patterns has caught the attention of state and federal regulators.

Earlier this year, California became the first state to pass consumer privacy legislation banning the use of dark patterns. Last month, Colorado followed suit with a ban of its own, and other states, such as Washington, have legislation in the works that would address dark patterns as well. "California bans 'dark patterns' that trick users into giving away their personal data," The Verge, March 16, 2021; "Why Colorado's data privacy bill may be a big mountain to climb for marketers," The Drum, June 10, 2021.

At the federal level, the Federal Trade Commission has indicated interest in regulating the use of dark patterns and held a public workshop on the topic in April 2021 in an effort to evaluate this issue and its impact on consumers and look toward regulations.

While regulation of obviously harmful dark patterns is expected, the recent proliferation of dark patterns has, to date, gone largely unregulated, particularly where their use is inadvertent or unintentional, leaving us at a new frontier in regulation.

This proliferation of dark patterns has been shown to have a disparate impact on consumers and, if left unchecked, will likely exacerbate current inequities as we adopt AI and automated technologies.

Dark patterns are manipulative or deceptive practices built into user interfaces by developers that have the effect, intentionally or unintentionally, of obscuring, subverting, or impairing consumer autonomy, decision-making, or choice. Dark patterns are often carefully designed to alter decision-making by users or trick users into actions they did not intend to take.

Some examples of dark pattern usage include:

• Deception — Using dark patterns to induce false beliefs, such as displaying a countdown timer on an offer that does not actually expire.

• Hidden Costs — Hiding fees or costs from users until they have spent time and entered their information to reach a checkout page, only to find significant additional fees that were not disclosed up front.

• Asymmetric Presentation — Making one option (such as agreeing to information sharing) very prominent and accessible, while making other options (such as opting out of information sharing) difficult to locate.

• Covert Ask — Presenting an offer that asks the consumer to provide certain data in exchange for a reward (e.g., an email address for a $25 discount), and then asking the consumer for more information (e.g., a phone number).

• Forced Continuity — Requiring a consumer to enter credit card information to sign up for a free trial, then sending no opt-out email or reminder to cancel before the trial ends, so the consumer automatically begins paying.

Put another way, the use of dark patterns in a user interface can manipulate the user cognitively and can cause users to purchase things, disclose data, agree to legal terms, or take other actions they did not intend to do, or can cause users to fail to accomplish what they set out to do, such as opting out of a subscription.

While some uses of dark patterns are easy to identify, such as intentional manipulation of users for the benefit of the app provider, other uses can be hard to identify and, in many cases, may be unintentional. In fact, the realm of inadvertent or unintentional use of dark patterns is quite expansive and is where regulators will likely have their work cut out for them.

In the tech industry, it has become commonplace to measure product success through user engagement; there is even an entire discipline of "growth hacking" devoted to maximizing customer acquisition and retention. This has arguably led to a singular business focus on growth at all costs, which may gloss over, or even incentivize, the use of manipulative practices in that pursuit.

For example, the gamification of products, nudging users, split testing, and other similar techniques are all normalized in today's business practices - yet some of these practices, regardless of intent, have the effect of eroding user agency and could be deemed dark patterns.

The proliferation of dark patterns will likely accelerate with the use of machine learning and automation, which can iterate on interface designs through automated testing with little to no human intervention.

This can lead to the development of algorithms and other design output that prioritize "growth" over user agency and lead to further normalization of dark patterns. Businesses that measure their success through user acquisition or engagement should carefully consider if their products are designed in a manipulative manner that would be considered dark patterns by regulators.
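The dynamic described above can be sketched with a toy split-testing loop. This is a minimal, hypothetical Python example (the variant names and conversion rates are invented for illustration): an epsilon-greedy optimizer scores interface variants purely on conversions, so a manipulative variant comes to dominate traffic without any human deciding to deploy it.

```python
import random

# Hypothetical variants and "true" conversion rates, chosen only to
# illustrate the dynamic: "fake_countdown" is a classic dark pattern
# (false urgency) that happens to convert better than the plain design.
TRUE_RATES = {
    "plain_checkout": 0.05,
    "fake_countdown": 0.12,
}

def run_split_test(trials=10_000, epsilon=0.1, seed=42):
    """Allocate traffic to whichever variant converts best so far.

    The objective is conversions alone; nothing in this loop asks
    whether a variant preserves user agency, so a manipulative design
    that converts better is promoted automatically.
    """
    rng = random.Random(seed)
    shown = {v: 0 for v in TRUE_RATES}
    converted = {v: 0 for v in TRUE_RATES}
    for _ in range(trials):
        if rng.random() < epsilon:
            variant = rng.choice(list(TRUE_RATES))  # explore at random
        else:  # exploit the best observed conversion rate so far
            variant = max(
                TRUE_RATES,
                key=lambda v: converted[v] / shown[v] if shown[v] else 0.0,
            )
        shown[variant] += 1
        if rng.random() < TRUE_RATES[variant]:  # simulate the user's choice
            converted[variant] += 1
    return shown

traffic = run_split_test()
```

After 10,000 simulated trials, the manipulative variant receives the large majority of the traffic. The point is not the algorithm itself but the objective function: when "growth" is the only metric, the optimization is indifferent to how a variant achieves it.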

Studies have shown that certain groups are more susceptible to dark patterns, including communities of color, lower-income individuals, children, older adults, and other historically disadvantaged groups. These groups typically experience more severe harm as a result of dark patterns, such as falling victim to threats and scams, loss of privacy, and financial harm. ("The Impact of Dark Patterns on Communities of Color," by Stephanie Nguyen, Data & Society: Points)

In contrast, individuals with high digital literacy are more likely to identify dark patterns and avoid falling prey to them. From a regulatory standpoint, in addition to protecting consumers generally from manipulation, regulators may also pursue preventing discrimination as an objective, so that dark patterns do not further exacerbate current social inequities.

Reining in the use of dark patterns has caught the attention of federal regulators. Under Section 5 of the FTC Act, the FTC has the authority to prosecute companies for unfair or deceptive trade practices, which it has exercised against businesses for the use of dark patterns.

Recently, the FTC brought charges against Age of Learning, Inc., which operates ABCmouse, alleging that ABCmouse made misrepresentations about cancellations and failed to disclose important information to consumers, leading tens of thousands of people to have their memberships renewed and be charged without proper consent. The complaint also alleged the company unfairly billed its users without their authorization and made it difficult for consumers to cancel their memberships, preventing consumers from avoiding additional charges. The company was ordered to pay about $10 million in a settlement order and to change its practices.

At the state level, both California and Colorado have passed consumer privacy legislation banning the use of dark patterns, and other states are contemplating doing the same. In California, businesses that do not comply can face fines of up to $7,500 per violation (which, if levied for use of dark patterns affecting many users, can be devastating). In Colorado, penalties can reach $20,000 per violation, with a maximum penalty of $500,000. As other states pass consumer privacy legislation, businesses can expect increasing liability for failure to comply.

Given the significant potential liability associated with the use of dark patterns at both the federal and state levels, businesses should stay keenly aware of regulatory trends in this area. This is an emerging area of law, but recent developments suggest that federal and state regulators will increase scrutiny of companies in order to stop manipulative practices and increase consumer choice. As more regulation comes to fruition at both levels, businesses will likely need to reexamine elements of their products, such as user interfaces and other designs, to ensure compliance.

Update: This article has been corrected to reflect that in the FTC action against Age of Learning Inc the $10 million that the company paid was the result of a settlement order.

Opinions expressed are those of the author. They do not reflect the views of Reuters News, which, under the Trust Principles, is committed to integrity, independence, and freedom from bias. Westlaw Today is owned by Thomson Reuters and operates independently of Reuters News.

Catherine Zhu is a data privacy and technology transactions attorney with Foley & Lardner LLP. She has advised high-growth companies on commercial and data privacy matters, including how to implement business intelligent privacy strategies within a complex regulatory environment. She can be reached at czhu@foley.com.