The EDPB sheds light on 'dark patterns' in the new Guidelines 3/2022


Who has never found themselves clicking on a huge green button underneath the privacy settings, convinced that this would validate their choices, only to realise that they had just, against all odds, accepted all the default settings? Who has never been confronted, on each visit to the same website, with complex cookie (sub)windows even longer than the page visited? And who has never given up on deleting their account on a social network at the fifth step of a tedious process? This has probably happened to everyone, so much so that it has inspired a mini game about refusing cookies as quickly as possible.

On 14 March 2022, the European Data Protection Board ("EDPB") published its Guidelines 3/2022 on dark patterns in social media platform interfaces, i.e. the non-transparent practices that influence or even force users to make decisions about their privacy or their rights.

In these new guidelines, which are currently open to public consultation and may therefore still change, the EDPB classifies and illustrates a range of dark patterns. Although the phenomenon can in principle occur not only on social networks but also on other platforms and websites, the EDPB explicitly addresses these guidelines to social network designers, as well as to their users, to enable the latter to better spot such practices.

Dark patterns are likely to violate various provisions of the GDPR

This assault on dark patterns sits at the crossroads of most, if not all, of the general principles of the GDPR. Indeed, a social network interface or user experience that nudges the user towards a more invasive decision about their personal data can lead to the following:

  • a lack of transparency (e.g. if the explanation text displayed contradicts itself),
  • a flawed consent (e.g. if consent is repeatedly requested) or a consent that is not as easy to withdraw as it was to give (e.g. if withdrawal requires many more clicks than the consent itself),
  • a violation of the data subject's rights (e.g. if a link to exercise a right merely redirects to generic information),
  • or a violation of the principle of data protection by design and by default (e.g. if the most invasive options are selected by default; see the sketch below).
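
To make the last point concrete, here is a minimal, purely illustrative sketch in TypeScript of what data protection by default can look like in code. All names and settings are invented for this example, not taken from the guidelines: the least invasive value is pre-selected for every setting, and anything more invasive requires an explicit user action.

```typescript
// Illustrative sketch only - names and settings are invented.
// Art. 25(2) GDPR requires that, by default, only the personal data
// necessary for each specific purpose are processed.

type Audience = "onlyMe" | "friends" | "everyone";

interface PrivacySettings {
  postAudience: Audience;     // who can see new posts
  personalisedAds: boolean;   // ad targeting based on profile data
  contactSync: boolean;       // upload of the user's address book
  profileSearchable: boolean; // profile discoverable via search
}

// The least invasive option is pre-selected everywhere; pre-selecting
// the most invasive ones would be the "Deceptive Snugness" pattern
// described further below.
const defaultSettings: PrivacySettings = {
  postAudience: "onlyMe",
  personalisedAds: false,
  contactSync: false,
  profileSearchable: false,
};
```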

More generally, dark patterns will, as the EDPB likes to recall, lead to a breach of the fairness principle of the GDPR, which is "an overarching principle which requires that personal data shall not be processed in a way that is detrimental, discriminatory, unexpected or misleading to the data subject". Finally, all these principles are supplemented by the accountability principle, according to which it is up to the controller to demonstrate compliance. Ultimately, these new guidelines are imbued with the stated aim of the GDPR to ensure data protection for data subjects by placing them at the heart of the management and effective control of their personal data.

Furthermore, the EDPB goes so far as to draw a parallel with consumer law, pointing out that providing incomplete information may additionally constitute a violation of consumer protection rules (e.g. misleading advertising).

What is the impact for social network designers and other data controllers?

These ambitious guidelines clearly have “GAFAM” and the major online platforms in their sights, in line with recent interventions by the European legislator such as the Data Act or the Digital Services Act package. They seem to go one step further than previous guidelines, in that they do not just apply the principles of the GDPR to typical, one-off processing activities (which could be considered "textbook cases"), but to an entire ecosystem. This includes not only consent to specific activities, but also the way in which all privacy options are presented - including audience settings when the user posts content - and, more generally, the entire day-to-day user experience on the social network.

Moreover, these new guidelines require a greater balancing effort on the part of the controller. For example, the data subject must be duly informed in advance but, at the same time, a privacy policy that is too exhaustive and inundates them with information may qualify as a Privacy Maze or a Look over there and may be contrary to the requirements of transparency, in particular as regards the concise and intelligible nature of the information provided. Similarly, consent must be collected in a specific and granular manner (and not "bundled" with other services or purposes), but the controller must be careful not to overwhelm the user with options, otherwise the practice may qualify as Too many options. It is therefore a case-by-case and sometimes delicate examination; one established way of striking the transparency balance is layered information, sketched below.
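
The following TypeScript sketch of such layered information is purely illustrative, with invented names and example wording: a concise first layer keeps the notice intelligible, while deeper layers keep it complete without burying the user.

```typescript
// Illustrative sketch only - structure and wording are invented.
// A short first layer avoids drowning users in text; the full detail
// stays one deliberate click away rather than hidden in a maze.

interface NoticeLayer {
  summary: string;          // first layer: short and intelligible
  details?: NoticeLayer[];  // deeper layers: full detail on demand
}

const emailNotice: NoticeLayer = {
  summary: "We use your e-mail address to secure your account.",
  details: [
    { summary: "Purpose: login verification and security alerts only." },
    { summary: "Retention: deleted 30 days after account closure." },
  ],
};
```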

Dark patterns can occur at all stages of social network use

The guidelines are structured chronologically and follow the "life cycle" of a social network account.

The dark patterns and applicable principles are thus illustrated through several use cases, starting with registration on the platform and ending with deletion of the account, passing through the provision of information at the start of use, the communication of a data breach, the configuration of privacy settings and, finally, the exercise of rights by the data subject.

At each of these stages, the EDPB also provides a set of best practices to help social network designers achieve compliance at the development phase.

The EDPB classification of dark patterns and its examples

Let's get to the heart of the matter: in the 64 pages of the guidelines, the EDPB distinguishes about fifteen practices that constitute dark patterns, divided into six main categories developed for the occasion. The EDPB also differentiates between content-based and interface-based dark patterns.

The overview below lists all the dark patterns identified and defined by the EDPB, together with the various examples related to them (very similar examples have been merged).

This classification is not perfectly watertight and, as privacy professionals will be accustomed to, is rather a tool to be applied on a case-by-case basis to identify risky practices. A single practice may thus correspond to several dark patterns.

Overloading

“Burying users under a mass of requests, information, options or possibilities in order to deter them from going further and make them keep or accept certain data practice.” (EDPB)

This includes:

Continuous Prompting

Repeatedly asking users to provide more data or to agree to new purposes, regardless of the choice already communicated by the user (a compliant alternative is sketched after the examples). Examples:

  • Continually requesting a phone number for security purposes (the EDPB adds that enhanced authentication can be implemented through other means).
  • Repeatedly displaying incentivising pop-ups that encourage users to give access to their contacts or to accept personalised content, especially if these pop-ups block users each time.
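
By way of contrast, here is a minimal TypeScript sketch, with invented names, of the compliant behaviour: a refusal is recorded as a decision in its own right and suppresses any further prompt for the same purpose.

```typescript
// Illustrative sketch only - names are invented. A recorded refusal is
// a decision like any other and must not trigger repeated requests.

interface ConsentDecision {
  granted: boolean;
  decidedAt: Date;
}

const decisions = new Map<string, ConsentDecision>();

function recordDecision(purpose: string, granted: boolean): void {
  decisions.set(purpose, { granted, decidedAt: new Date() });
}

function shouldPrompt(purpose: string): boolean {
  // Prompt only if the user has never decided on this purpose.
  return !decisions.has(purpose);
}

// The user refuses contact syncing once...
recordDecision("contact-sync", false);
// ...and is not asked again on the next visit.
console.log(shouldPrompt("contact-sync")); // false
```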

Privacy Maze

Obtaining information or exercising data subjects’ rights becomes a “treasure hunt”, making it likely that users will give up. Examples:

  • Referring, in the privacy policy, to a Q&A section instead of directly providing the information or a direct link to it.
  • Dividing related privacy settings into several menus and sections.
  • Having to move several times from one webpage to another to find basic information, such as how to exercise data subjects’ rights.
  • Providing an easy way for users to download all their personal data, while disguising the link to request access to specific data only (see Art. 15(3) GDPR and our blogpost on the new guidelines on the right of access for more information).
  • Disguising the link to delete an account at the end of a Privacy page.
  • Using self-created terminology, requiring users to find the definition.

Too many options

Providing too many options to choose from, leaving users unable to make a choice. Examples:

  • Settings related to the same aspect of data protection are spread amongst multiple pages, which exposes users to too many options.

Skipping

“Designing the interface or user experience in such a way that users forget or do not think about all or some of the data protection aspects.” (EDPB)

Deceptive Snugness

By default, the most invasive features and options are enabled in an effort to take advantage of the default effect. Examples:

  • The setting “share this information with everyone” is pre-selected instead of a more restrictive setting.
  • The setting “pause this account” is pre-selected instead of “delete this account”.

Look over there

Providing irrelevant or unnecessary information to distract users from their initial intent. Examples:

  • In a data breach notification, a controller includes a lot of irrelevant information, or states that data were hashed although this was only the case for passwords.
  • A controller describes cookies and collects consent to them with humour, thus misrepresenting the potential risks.
  • In the process of deleting their account, users are invited to follow a link to download their data, but are then not taken back to the deletion process; or the website asks for the reason for leaving and displays pop-ups with ad hoc solutions.

Stirring

“Affecting the choice users would make by appealing to their emotions or using visual nudges.” (EDPB)

Emotional Steering

Using reassuring or negative words or images to influence the user's emotional state and prevent them from making a rational decision. Examples:

  • The use of motivational language or exclamation marks to incentivise users to share more data (e.g. with a tone that creates a sense of urgency or sounds like an imperative).
  • Stating, when an account is about to be deleted by a user, that their friends will miss them.

Hidden in plain sight

Using a visual style for information or data protection controls that nudges users towards less restrictive options (a neutral alternative is sketched after the examples). Examples:

  • Using a tiny icon hidden in a webpage as a link to the privacy policy.
  • The link to reject is far less visible than the link to accept, or the link enabling users to skip a step is so small that they will not notice it and assume the step is mandatory.
  • Users must guess that an element (e.g. a privacy setting) is clickable and therefore changeable.
  • Users must hover over a word to reveal a relevant link.
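
As a counter-example, here is a small TypeScript sketch producing plain HTML, with invented markup and styling: both consent options share the same size, colour scheme and placement, so neither choice is visually privileged.

```typescript
// Illustrative sketch only - markup and styling are invented. Both
// buttons get exactly the same visual weight; only the labels differ.

const sharedStyle =
  "font-size: 16px; padding: 12px 24px; border-radius: 4px;";

function consentButtons(): string {
  return `
    <button style="${sharedStyle}" data-action="accept">Accept all</button>
    <button style="${sharedStyle}" data-action="reject">Reject all</button>
  `;
}

console.log(consentButtons());
```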

Hindering

“Hindering or blocking users in their process of obtaining information or managing their data by making the action hard or impossible to achieve.” (EDPB) 

Dead End

A piece of information is impossible to find because a link either does not work or is not available at all. Examples:

  • Users are informed that they can change their choice at any time by visiting the privacy policy page, but the policy does not explain in concrete terms how to withdraw consent.
  • For certain aspects, the privacy policy contains only general statements that more information can be accessed, without further explanation.
  • Nothing visually happens when unticking a box to withdraw consent.
  • A link to exercise a right redirects to the main profile page.
  • A privacy policy mentions third-party recipients without providing links to their respective policies.

Longer than necessary

It takes more steps to disable privacy-invasive options than to enable them (a symmetric alternative is sketched after the examples). Examples:

  • Users are asked “are you sure?” when they try to skip a screen inviting them to share additional data.
  • When deleting data or an account, the page prompts not only for confirmation (which can be seen as a security measure) but also for the reason for the deletion.
  • Opting in to targeted advertising requires only one click, but opting out does not.
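
To illustrate the symmetric alternative, a minimal TypeScript sketch with invented names: enabling and disabling targeted advertising go through exactly the same single step, with no extra confirmation screens or surveys on the way out.

```typescript
// Illustrative sketch only - names are invented. One call (one click)
// in either direction; nothing extra is added when disabling.

let targetedAds = false; // off by default (data protection by default)

function setTargetedAds(enabled: boolean): void {
  targetedAds = enabled;
}

setTargetedAds(true);  // one step to opt in
setTargetedAds(false); // one step to opt out
```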

Misleading information

There is a discrepancy between information given and the actions available. Examples:

  • Users expect a link to take them to a mobile application download, but are instead required to fill in their phone number to receive the direct link.
  • Users click on a link to withdraw consent but are redirected to another website explaining what consent is.

Fickle

“The design of the interface is unstable and inconsistent, making it hard for users to figure out where the different controls really are and what the processing is about.” (EDPB)

Lacking hierarchy

Information related to data protection is presented several times in several ways. Examples:

  • In a privacy policy, the right to lodge a complaint is stated in a different section than the other rights.
  • A 70-page privacy policy does not contain any sections or headings.
  • The audience parameters for a post are always presented in a certain order, except for one type of post where the order is reversed.
  • The privacy settings icons are presented inconsistently between the desktop and mobile versions of the social network.

Decontextualising

Data protection information or a data protection control is located on a page that is out of context. Examples:

  • Privacy settings must be accessed through the “security” page.
  • The link to deactivate or delete an account is placed under the “Your data” or “Delete a function of your account” section.
  • The “save” button of the privacy settings is not sufficiently visible, so that users assume that the settings have been saved automatically.

Left in the dark

“The interface is designed in a way to hide information or controls related to data protection or to leave users unsure of how data is processed and what kind of controls they might have over it.” (EDPB)

Language discontinuity

Information related to data protection is not provided in the official language(s) of the country where users live, whereas the service is. Examples:

  • Unlike the rest of the platform, information about data protection is either not available in the user's language, or another language is displayed by default.

Conflicting information

The pieces of information contradict each other in some way. Examples:

  • It is stated that the privacy settings can be changed before posting and, in the next sentence, that the settings will be available after posting.
  • A controller states that it has suffered a data breach, but then states that the breach originated from a third party, and/or assesses the severity of the breach in relation to itself rather than to the data subjects.
  • The green and red colours and/or positions of a toggle switch for changing a setting are reversed.

Other aspects that deserve attention 

  • The EDPB once again confirms the prohibition on denying access to a service where consent is refused, and thus its position on the illegality of cookie walls.
  • A point that is likely to be debated during the public consultation is the sending of a code by SMS for security reasons. The guidelines seem reluctant about this method of two-factor authentication, which experts currently consider less secure than other methods. Some will nevertheless point out that it is probably better to have this option activated than to have no second factor at all. Even so, the EDPB sees the very use of this technique, and thus the collection and use of the phone number (which is “not that easily interchangeable”), as a potential violation of the minimisation principle, since alternatives exist. One can speculate that this stems from the finding that some controllers invoke security reasons to collect telephone numbers and then use them for other purposes. In such cases, however, the purpose limitation principle applies, so that use for other purposes would in any case be prohibited.

    Incidentally, it is questionable whether this issue is really about dark patterns and whether it belongs in these guidelines.

  • Although the EDPB repeatedly refers to, and even encourages, the use of "?" signs or icons providing more information (which is not surprising, as this method of visual information is promoted by the GDPR), it emphasises that the text displayed upon clicking or hovering should indeed contain more detailed information, not superfluous information or prompts.
  • More generally, although the GDPR rules invoked in the guidelines apply to data controllers in all sectors, the EDPB has chosen to focus on social media platforms and their users only. Is this a convenient first step in an enforcement strategy - in which case all other sectors and websites will eventually be targeted - or does it mean that social networks have a greater responsibility - and if so, on what basis? It is probably true that social networks pose a relatively high risk to privacy, but some platforms in other sectors pose as much or more risk.

Want to know more?

The guidelines (version 1.0) were adopted in March 2022 and the public consultation will remain open until 2 May 2022. Do you want to stay up to date on this topic? Follow us on LinkedIn!

Do you have a specific question or would you like support in this matter? We will be happy to help you. Book a free 15-minute call with Janvier at janvier.lawyer.brussels (for organisations only).