Dark patterns – deceptive design patterns and privacy
“Dark patterns” (deceptive design patterns) is an umbrella term covering a variety of practices used in digital user interfaces (UI) that lead users to make decisions that are not aligned with their actual interests or intentions.
What are dark patterns?
The term was introduced in 2010 by designer Harry Brignull and refers to design elements that subtly but systematically influence user behaviour – most often to the benefit of the service provider, at the expense of user privacy or control over personal data. In practice, dark patterns often appear in how cookie consent is designed, how account registration works, or how easy it is to exercise users’ rights under the GDPR.
Dark patterns may occur, among others, in:
- social media platform interfaces,
- cookie banners,
- online stores,
- mobile applications,
- video games and microtransaction systems,
- registration forms and newsletters.
In the context of personal data protection, they are particularly problematic because they may lead to:
- unintentional sharing of excessive amounts of data,
- giving consent without real freedom of choice,
- granting unnecessary permissions,
- difficulties in exercising rights under the GDPR (e.g. withdrawing consent or deleting an account).
How can privacy-related dark patterns be grouped?
Source materials point to many categories of dark patterns, some of which overlap in meaning. The classification below uses a simplified, functional approach, focusing on the effect on the user and on data protection. It is not a closed list, and assessing legality always requires analysis of the specific case.
Forced action
What it means: the user has no real possibility of using the service without taking a specific action – most often registering an account or consenting to broad data processing.
Examples:
- No “Reject” button in a cookie banner.
- A message such as: “By using this site, you agree to the Privacy Policy” – without any alternative.
- A message such as: “Create an account to complete your purchase” (forced account registration).
Hiding information and settings
What it means: information about data processing or privacy settings is difficult to find, scattered, or only accessible after multiple steps.
Examples:
- During registration, the user receives a link to the Privacy Policy, but the document is not clearly visible on the landing page and must be searched for within additional content.
- In the privacy policy, specific information is not located where a typical user would expect it, or is spread across multiple documents.
- The option to withdraw marketing consent exists, but is hidden under:
Account settings → Preferences → Communication → Partners → “Manage” → only there a list of multiple toggles appears, without a “reject all” option and without explaining the consequences.
More intrusive privacy settings by default
What it means: the most privacy-intrusive options are enabled by default, and the user must actively change them to limit data processing.
Examples:
- A pre-ticked marketing consent checkbox during registration.
- When entering a date of birth, multiple sharing options exist, but the default is set to “public.”
- Ad personalisation enabled automatically.
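The contrast between privacy-protective defaults and dark-pattern defaults can be sketched in code. The sketch below is purely illustrative – the class and field names are hypothetical and not taken from any real product:

```python
from dataclasses import dataclass

@dataclass
class PrivacyByDefaultSettings:
    """Privacy by default (GDPR Art. 25): every optional processing
    purpose starts disabled, and the user must actively opt in."""
    marketing_emails: bool = False       # no pre-ticked checkbox
    ad_personalisation: bool = False     # not enabled automatically
    profile_visibility: str = "private"  # date of birth etc. not public by default

@dataclass
class DarkPatternDefaults:
    """Anti-example: the most intrusive options are on unless the user
    hunts them down and switches them off."""
    marketing_emails: bool = True
    ad_personalisation: bool = True
    profile_visibility: str = "public"

# A user who accepts whatever the form proposes ends up in two very
# different places depending on the defaults:
compliant = PrivacyByDefaultSettings()
dark = DarkPatternDefaults()
print(compliant.ad_personalisation, dark.ad_personalisation)  # False True
```

The point of the sketch is that the “do nothing” path should be the least intrusive one; any broader processing should require an affirmative action by the user.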
Misleading language and misinformation
What it means: the use of ambiguous, contradictory or manipulative wording that makes it harder to understand the consequences of a decision.
Examples:
- Double negatives such as: “Do not uncheck this box if you do not want to stop receiving important information from us.”
- Buttons labelled “OK” instead of a clear “Accept.”
- General statements such as: “Your data may be used to improve our services,” without explaining what data, for what purpose, and how.
Emotional manipulation
What it means: applying emotional pressure through language, visuals, colours or messaging to push users toward more intrusive choices.
Examples:
- A message when unsubscribing from a newsletter: “We’re sad to see you go.”
- Two options when asking for newsletter sign-up, where one is framed positively and the other negatively:
“Sign me up! I want to know when I save money!”
“No thanks, I prefer to pay full price.”
- A pop-up encouraging users to provide additional, non-essential data: “We want to get to know you better! We can’t wait, so don’t hesitate and tell us more about yourself!”
- “Enable personalisation so we can provide you with the best experience” + a large “Enable” button, while refusal appears as a small “Skip” link (without explaining that it may involve, for example, ad profiling).
Hard to opt out (asymmetry – easy in, hard out, roach motel)
What it means: registration or consent is easy, but withdrawing consent, cancelling or deleting an account is significantly more difficult.
Examples:
- No option to delete an account in the settings.
- Requirement to contact customer support by phone.
- Multi-step subscription cancellation processes.
Repeated prompting
What it means: repeated requests for consent or data intended to “wear down” the user and push them to change their initial decision.
Examples:
- A newsletter pop-up keeps reappearing despite being closed.
- A request for specific data (e.g. a phone number) appears on every login until the user provides it.
- Closing the window only results in a “Remind me later” option, without a “Never ask again.”
Choice overload
What it means: presenting too many options or settings in a way that makes it difficult to make an informed choice.
Examples:
- Complex cookie panels without explanations.
- Multiple equal options without indicating consequences.
- Lack of supporting information (e.g. what a cookie category means, how long it lasts, who receives the data).
Keeping users in the dark
What it means: providing information in a vague, overly general way or in a language not adapted to the user.
Examples:
- A platform is available in Croatian or Spanish, but key data protection information is only in English.
- Help pages automatically switch language based on location, despite the user previously choosing a different language.
- A privacy policy statement such as: “Your data may be used to improve our services” – without specifying what data and how it will be used.
Distraction
What it means: privacy-related elements are presented alongside something else (often urgent or attractive), causing the user to lose focus and make a decision “along the way” instead of consciously setting preferences.
Examples:
- In a cookie banner, next to consent options there is a large “Continue” button, while cookie settings are hidden under a “Learn more” link.
- A privacy settings window appears at the same time as a request for notifications: the user clicks “Allow” thinking it is required to use the service.
- A mobile app requests geolocation “for better recommendations” while simultaneously displaying a message about “new features,” with a prominent “Enable” button.
Dead end
What it means: the user looks for information or control (e.g. deleting an account), but the path leads nowhere – the link does not work, the option does not exist, or it leads to a help page without a solution.
Examples:
- An option “Delete account” exists in settings, but clicking it leads to a help article saying “Contact us” without a working form.
- A “Manage consent” link in the footer leads to a 404 page or to a general privacy policy without a settings panel.
- The user tries to withdraw marketing consent – there is a toggle, but after saving, the setting reverts to its previous state (no effect, no feedback).
Are dark patterns illegal?
Dark patterns are, above all, unfair and misleading practices towards users and their privacy. Many of them may also be unlawful under various regulations, in particular:
- GDPR – principles of fairness, transparency and data minimisation (Art. 5), definition of and conditions for consent (Art. 4(11) and Art. 7), obligations to facilitate the exercise of rights (Art. 12–22).
- Digital Services Act (DSA) – prohibition of designing online platform interfaces in a misleading or manipulative way (Art. 25(2)).
- ePrivacy Directive (2002/58/EC) – including consent for cookies and similar technologies (Art. 5(3)) and rules on direct marketing in electronic communications (Art. 13).
In practice, if interface manipulation concerns, for example, consent to data processing, its legality will primarily be assessed under the GDPR (and often the ePrivacy Directive in the context of cookies), regardless of the general prohibition of dark patterns under the DSA.
In the EU, there have already been decisions, fines and formal actions related to mechanisms described as dark patterns. For example, in 2024 the Polish competition authority (UOKiK) imposed a fine of over PLN 31 million on Amazon, among others for practices classified as dark patterns (time-pressure mechanisms without guaranteed delivery), and the European Commission issued preliminary findings against platform X concerning dark patterns and advertising transparency (DSA).
In February 2026, the European Commission presented preliminary findings indicating that TikTok may be in breach of the DSA due to “addictive design” (including infinite scroll, autoplay, push notifications and highly personalised recommendation systems).
At the same time, work is ongoing at EU level on the Digital Fairness Act, which is expected to address, among others, dark patterns, addictive design and unfair personalisation.
For users – how to protect yourself?
- Read privacy policies and consent notices.
- Pay attention to the absence of real alternatives (e.g. no “Reject” option).
- Regularly review permissions and settings.
- Use privacy tools (e.g. browser settings, extensions).
- Report unfair practices to regulators.
For businesses – good practices
- Design interfaces in line with privacy by design and privacy by default principles.
- Ensure symmetry: easy to give consent – easy to withdraw it.
- Use clear, simple language and logical information structure.
- Conduct regular UX and GDPR compliance audits.
- Involve data protection experts at the design stage, not after implementation.
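The symmetry principle above can be sketched in code. The example below is a minimal, hypothetical sketch (the `ConsentStore` class and its methods are illustrative, not a real library API): withdrawing consent is a single call that mirrors giving it, and both actions are timestamped so the controller can demonstrate the current state.

```python
from datetime import datetime, timezone

class ConsentStore:
    """Symmetric consent handling: withdrawal is as easy as giving
    consent (GDPR Art. 7(3)), with an append-only audit trail that
    supports accountability (Art. 5(2))."""

    def __init__(self):
        self._log = []  # append-only: (purpose, granted?, timestamp)

    def give(self, purpose: str) -> None:
        self._log.append((purpose, True, datetime.now(timezone.utc)))

    def withdraw(self, purpose: str) -> None:
        # Deliberately mirrors give(): no extra steps, phone calls or
        # multi-page cancellation flows.
        self._log.append((purpose, False, datetime.now(timezone.utc)))

    def is_granted(self, purpose: str) -> bool:
        # The most recent entry for a purpose determines its state;
        # no record at all means no consent.
        for p, granted, _ in reversed(self._log):
            if p == purpose:
                return granted
        return False

store = ConsentStore()
store.give("marketing_emails")
store.withdraw("marketing_emails")
print(store.is_granted("marketing_emails"))  # False
```

A design worth noting: because the log is append-only, the controller retains evidence of when consent was given and withdrawn, rather than silently overwriting the previous state.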
This article refers to the Polish legal framework and is based on the legal status, case law and practice applicable at the time of publication. The scope and manner of personal data processing may differ between countries and depend on local laws.
Sources and further reading:
Digital Services Act (DSA), https://eur-lex.europa.eu/eli/reg/2022/2065
ePrivacy Directive – Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector (Directive on privacy and electronic communications)
Dark Commercial Patterns, OECD, No. 336, October 2022, https://www.oecd.org/en/publications/dark-commercial-patterns_44f5e846-en.html
EDPB Guidelines 03/2022 on deceptive design patterns in social media interfaces, https://www.edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-032022-deceptive-design-patterns-social-media_en
UOKiK decision (Amazon, 2024) https://uokik.gov.pl/31-mln-zl-kary-dla-amazon
Commission preliminarily finds TikTok’s addictive design in breach of the Digital Services Act, European Commission, February 2026, https://digital-strategy.ec.europa.eu/en/news/commission-preliminarily-finds-tiktoks-addictive-design-breach-digital-services-act
Digital Fairness Act – Legislative Train Schedule, EU Parliament, https://www.europarl.europa.eu/legislative-train/theme-protecting-our-democracy-upholding-our-values/file-digital-fairness-act
European Commission – preliminary findings on platform X, https://ec.europa.eu/commission/presscorner/detail/pl/ip_24_3761