Who has never clicked a huge green button beneath the privacy settings, convinced it would validate their choices, only to realise that they had, against all odds, just accepted all the default settings? Who has never faced, on every visit to the same website, complex cookie pop-ups even longer than the page itself? And who has never abandoned the idea of deleting their account on a social network at the fifth step of a tedious process? This has probably happened to everyone, so much so that it has even inspired a mini game about refusing cookies as quickly as possible.
On 14 March 2022, the European Data Protection Board ("EDPB") published Guidelines 3/2022 on dark patterns in social media platform interfaces, i.e. non-transparent practices that steer or even force users into making decisions about their privacy or their rights.
In these new guidelines, which are currently open for public consultation and may therefore still change, the EDPB classifies and illustrates a range of dark patterns. Although the phenomenon can in principle occur not only on social networks but also on other platforms and websites, the EDPB explicitly addresses these guidelines to social network designers, as well as to their users, to help the latter better spot such practices.
This assault on dark patterns comes at the crossroads of most, if not all, of the general principles of the GDPR. Indeed, a social network interface or user experience that encourages the user to make a more invasive decision about their personal data can lead to the following:
More generally, dark patterns will, as the EDPB likes to recall, lead to a breach of the fairness principle of the GDPR, which is "an overarching principle which requires that personal data shall not be processed in a way that is detrimental, discriminatory, unexpected or misleading to the data subject". Finally, all these principles are supplemented by the accountability principle, according to which it is up to the controller to demonstrate compliance. Ultimately, these new guidelines are imbued with the stated aim of the GDPR to ensure data protection for data subjects by placing them at the heart of the management and effective control of their personal data.
Furthermore, the EDPB goes so far as to draw a parallel with consumer law, pointing out that the provision of incomplete information may additionally constitute a violation of consumer law (e.g. misleading advertising).
These ambitious guidelines clearly have “GAFAM” and the major online platforms in their sights, in line with recent interventions by the European legislator such as the Data Act or the Digital Services Act package. They go one step further than previous guidelines, in that they do not merely apply the principles of the GDPR to typical, one-off processing activities (which could be considered "textbook cases"), but to an entire ecosystem. This covers not only consent to specific activities, but also the way in which all privacy options are presented, including audience settings when the user posts content, and, more generally, the entire day-to-day user experience on the social network.
The guidelines are structured chronologically and follow the "life cycle" of a social network account.
The dark patterns and the applicable principles are thus illustrated through several use cases, from registration on the platform to deletion of the account, covering along the way the provision of information at the start of use, the communication of a data breach, the presentation of privacy settings and, finally, the exercise of rights by the data subject.
At each of these stages, the EDPB also provides a set of best practices to help social network designers achieve compliance at the development phase.
Let's get to the heart of the matter: in the 64 pages of the guidelines, the EDPB identifies some fifteen practices constituting dark patterns, grouped into six main categories developed for the occasion. The EDPB also distinguishes between content-based and interface-based dark patterns.
The table below contains all the dark patterns identified and defined by the EDPB, as well as the different examples related to them (very similar examples have been merged).
This classification is not perfectly watertight and, as privacy professionals are well accustomed to, is rather a tool to be applied on a case-by-case basis to identify risky practices. A single practice may therefore correspond to several dark patterns.
Overloading: “Burying users under a mass of requests, information, options or possibilities in order to deter them from going further and make them keep or accept certain data practice.” (EDPB)

Continuous prompting: Repeatedly asking users to provide more data or to agree to new purposes, regardless of the choice the user has already communicated. Examples:

Privacy maze: Obtaining information or exercising data subjects’ rights becomes a “treasure hunt”, so that users are likely to give up. Examples:

Too many options: Providing too many options to choose from, leaving users unable to make a choice. Examples:
Skipping: “Designing the interface or user experience in such a way that users forget or do not think about all or some of the data protection aspects.” (EDPB)

Deceptive snugness: By default, the most invasive features and options are enabled, in order to take advantage of the default effect. Examples:

Look over there: Providing irrelevant or unnecessary information to distract users from their initial intent. Examples:
Stirring: “Affecting the choice users would make by appealing to their emotions or using visual nudges.” (EDPB)

Emotional steering: Using reassuring or negative words or images to influence the user's emotional state and prevent them from making a rational decision. Examples:

Hidden in plain sight: Using a visual style for information or data protection controls that nudges users towards less restrictive options. Examples:
Hindering: “Hindering or blocking users in their process of obtaining information or managing their data by making the action hard or impossible to achieve.” (EDPB)

Dead end: A piece of information is impossible to find because a link is either broken or not available at all. Examples:

Longer than necessary: It takes more steps to disable privacy-invasive options than to enable them. Examples:

Misleading information: There is a discrepancy between the information given and the actions actually available. Examples:
Fickle: “The design of the interface is unstable and inconsistent, making it hard for users to figure out where the different controls really are and what the processing is about.” (EDPB)

Lacking hierarchy: Information related to data protection is presented several times and in several different ways. Examples:

Decontextualising: Data protection information or a control is located on a page that is out of context. Examples:
Left in the dark: “The interface is designed in a way to hide information or controls related to data protection or to leave users unsure of how data is processed and what kind of controls they might have over it.” (EDPB)

Language discontinuity: Information related to data protection is not provided in the official language(s) of the country where users live, whereas the service itself is.

Conflicting information: Pieces of information that contradict each other in some way. Examples:
Incidentally, it is questionable whether this issue is really about dark patterns and whether it belongs in these guidelines.
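Several of the practices above come down to default settings: data protection by default (Article 25(2) GDPR) requires the opposite of invasive pre-enabled options, namely that the least intrusive value applies unless the user actively opts in. As a minimal sketch of what that means for designers, the following uses a purely hypothetical settings model (the names and options are illustrative, not taken from the guidelines):

```python
# Illustrative sketch of privacy-protective defaults (hypothetical model).
# The point: a user who never touches the settings is never opted in.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Defaults reflect data protection by default (Art. 25(2) GDPR):
    profile_visibility: str = "friends_only"  # not "public"
    ad_personalisation: bool = False          # opt-in, never pre-ticked
    location_sharing: bool = False            # opt-in, never pre-ticked

# Inaction keeps the protective defaults; any wider sharing is an explicit act.
settings = PrivacySettings()
print(settings.ad_personalisation)  # False until the user opts in
```

A design built this way cannot exhibit the pre-enabled invasive defaults the EDPB criticises, since every broadening of processing requires a deliberate change by the user.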
Do you have a specific question or would you like support in this matter? We would be happy to help. Book a free 15-minute call with Janvier at janvier.lawyer.brussels (for organisations only).