Dark patterns in data protection

May 8, 2023

In a time when fake news is so prevalent, the topic of dark patterns has gained considerable relevance. Once a purely academic term, it now appears regularly in data protection regulations worldwide.

Indeed, the term was coined in 2010 by Harry Brignull, a British UX designer, who even created a website dedicated to the topic. He also refers to the practice as “deceptive design”, a term equivalent to “dark pattern”.

Dark patterns are interface designs that attempt to trick, coerce, or pressure users into taking certain actions, such as buying or signing up for something. To this end, such designs may present the available choices with unequal alleged benefits, or rely on false, misleading, or hidden statements, inducing inappropriate choices or behavior contrary to personal data protection laws.

Dark patterns are far more common than one would think and are often practiced by large companies, whether national or multinational. For example, a website may display a pop-up consent form with a single “Accept all cookies” button for the user to click, without any option to reject cookies or to choose which ones are acceptable.
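
For illustration only, the sketch below shows the compliant opposite of such a banner: a “Reject all cookies” button offered with the same prominence, and requiring the same single click, as “Accept all cookies”. The function and type names are hypothetical and not taken from any real consent-management library.

```typescript
// Purely illustrative: a consent banner where rejecting is exactly as easy and
// as visible as accepting. Every name below (renderConsentBanner, ConsentResult,
// the cookie categories) is hypothetical, not part of any real consent library.

type ConsentChoice = "accept_all" | "reject_all";

interface ConsentResult {
  choice: ConsentChoice;
  categories: { analytics: boolean; marketing: boolean };
}

function renderConsentBanner(onDecision: (result: ConsentResult) => void): void {
  const banner = document.createElement("div");
  banner.setAttribute("role", "dialog");
  banner.textContent = "We use cookies. Choose how your data may be used: ";

  // Both options are plain buttons with identical styling and a single click each,
  // so the user can refuse as easily as they can accept.
  const acceptAll = document.createElement("button");
  acceptAll.textContent = "Accept all cookies";
  acceptAll.onclick = () =>
    onDecision({ choice: "accept_all", categories: { analytics: true, marketing: true } });

  const rejectAll = document.createElement("button");
  rejectAll.textContent = "Reject all cookies";
  rejectAll.onclick = () =>
    onDecision({ choice: "reject_all", categories: { analytics: false, marketing: false } });

  banner.append(acceptAll, rejectAll);
  document.body.appendChild(banner);
}
```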

Another example, often found on newspaper and magazine websites, is paid advertising material in which misleading claims boost sales of a given product by attributing outlandish, untrue characteristics to it.

Harry Brignull’s website lists several types of dark patterns, which are transcribed below:

Comparison prevention – The user struggles to compare products because features and prices are combined in a complex manner, or because essential information is hard to find.

Confirmshaming – The user is emotionally manipulated into doing something that they would not otherwise have done.

Disguised ads – The user mistakenly believes they are clicking on an interface element or native content, but it is actually a disguised advertisement.

False scarcity – The user is pressured into completing an action because they are presented with a fake indication of limited supply or popularity.

Fake social proof – The user is misled into believing a product is more popular or credible than it really is, because they were shown fake reviews, testimonials, or activity messages.

False urgency – The user is pressured into completing an action because they are presented with a fake time limitation.

Forced action – The user wants to do something, but they are required to do something else undesirable in return.

Hard to cancel – The user finds it easy to sign up or subscribe, but when they want to cancel they find it very hard.

Hidden costs – The user is enticed with a low advertised price. After investing time and effort, they discover unexpected fees and charges when they reach the checkout.

Hidden subscription – The user is unknowingly enrolled in a recurring subscription or payment plan without clear disclosure or their explicit consent.

Nagging – The user tries to do something, but they are persistently interrupted by requests to do something else that may not be in their best interests.

Obstruction – The user is faced with barriers or hurdles, making it hard for them to complete their task or access information.

Preselection – The user is presented with a default option that has already been selected for them, in order to influence their decision-making.

Sneaking – The user is drawn into a transaction on false pretenses, because pertinent information is hidden or delayed from being presented to them.

Trick wording – The user is misled into taking an action, due to the presentation of confusing or misleading language.

Visual interference – The user expects to see information presented in a clear and predictable way on the page, but it is hidden, obscured or disguised.

The issue is viewed with great concern and seriousness by the authorities responsible for consumer protection and personal data protection. In March 2022, the European Data Protection Board (EDPB) published its Guidelines on deceptive design patterns in social media platform interfaces, which explain how to recognize and avoid dark patterns.

These guidelines clarify that the General Data Protection Regulation (GDPR) relies on the principle of fair processing established in Article 5(1)(a), which serves as the starting point for assessing whether a design does indeed constitute a dark pattern. Other principles relevant to this assessment are transparency, data minimization, and accountability, under point (c) and paragraph (2) of the same Article, as well as purpose limitation under point (b). Furthermore, in other cases, the legal assessment may also be based on the conditions for consent in Articles 4(11) and 7, or on other specific obligations, such as those in Article 12.

It is no coincidence that, in January 2022, Facebook and Google were hit with considerable fines in the European Union for using dark patterns in their cookie consent processes, which were considered confusing and difficult to understand. Other examples of efforts to define and regulate dark patterns can be found in US law, most notably the California Consumer Privacy Act (CCPA), the California Privacy Rights Act (CPRA), and the Colorado Privacy Act (CPA).

While the CCPA does not explicitly mention dark patterns, the concept emerged during its regulatory process. In the Final Statement of Reasons, a document produced to accompany the draft regulations, the California Attorney General stated that “It would run counter to the intent of the CCPA if websites introduced choices that were unclear or, worse, employed deceptive dark patterns to undermine a consumer’s intended direction.” The CPRA, in turn, defines dark patterns as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.” Finally, the CPA defines dark patterns as a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, a definition almost identical to that of the CPRA. The CPA takes effect in July 2023, while the CPRA took effect in January of the same year.

As such, five principles that companies must observe are introduced:

  1. Ease of understanding – applies both to the language used and to the design itself.
  2. Symmetry in choice – exercising a “more privacy” option should not take longer or be more difficult than exercising a “less privacy” option.
  3. No confusing language or coercive interactive elements – double negatives are a clear example of confusing language, e.g. placing “Yes” and “No” options next to the statement “Do not sell or share my personal information”.
  4. No manipulative language or choice architecture – for example, offering a discount or other financial incentive in exchange for consent is considered manipulation.
  5. Ease of execution – there should be a fully functional, built-in consent management tool that facilitates opt-outs without adding friction; circular links and inoperative, non-functional e-mail addresses are examples of what to avoid (a minimal code sketch illustrating these last two principles follows this list).
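
As a purely illustrative sketch of the symmetry-in-choice and ease-of-execution principles above, the snippet below models consent handling where opting out takes exactly the same single step as opting in. The ConsentStore class and its methods are hypothetical, not drawn from any statute, regulation, or real library.

```typescript
// Purely illustrative: opting out takes the same single call as opting in, and
// the preference applies immediately. ConsentStore and its API are hypothetical.

class ConsentStore {
  private granted = new Set<string>();

  // One step to opt in to a processing category...
  optIn(category: string): void {
    this.granted.add(category);
  }

  // ...and exactly one step, with no extra screens or confirmations, to opt out.
  optOut(category: string): void {
    this.granted.delete(category);
  }

  isGranted(category: string): boolean {
    return this.granted.has(category);
  }
}

// A "Do not sell or share my personal information" link can call optOut directly,
// with no nagging, circular links, or dead e-mail addresses in the way.
const consent = new ConsentStore();
consent.optIn("analytics");
consent.optOut("analytics");
console.log(consent.isGranted("analytics")); // false
```

Whatever the implementation, the key point is simply that withdrawing consent or opting out must be as frictionless as granting it.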
