Online Nudges and Dark Patterns
- Eddie Reilly
- Sep 20, 2021
- 4 min read
Updated: Feb 20, 2022

Nudge theory was first described and developed by the behavioural economist Richard H. Thaler. Fundamentally, nudges are subtle ways in which people can be manipulated into making certain choices. There can be noble reasons behind these nudges, such as getting a motorist to slow down in a built-up area by painting the lines on the road closer together, creating an illusion of speed in the mind of the driver. Nudges can, of course, also be used for less benign motives, such as persuading a consumer to buy more of a product than they otherwise would have.
When deceptive or manipulative techniques are routinely employed on websites to get individuals to make choices they would otherwise not make, they fall into this second category. The term coined for these practices just over a decade ago is 'dark patterns'.
How many times have we been confronted with the choice between an 'allow all cookies' button, normally presented in an attractive shade of green, and a 'manage cookie settings' button, often in a less attractive shade of grey? There are multiple variations on this, although sometimes we get the simpler option of accepting all cookies or rejecting all.
Design architecture takes into account that people wanting to access a website will often just accept all cookies to speed up the process, rather than take on the slightly more arduous task of choosing between essential and non-essential cookies on every site they visit. Cookie consent notices have been ever-present since the General Data Protection Regulation came into force, and in many cases users simply accept everything to avoid the hassle of configuring their settings.
Even though a website might be complying with its legal obligations for cookie consent notices under the General Data Protection Regulation, there are clearly not-so-subtle aspects incorporated into the design architecture to 'nudge' the user quite forcibly down the 'desirable' paths set out by the provider. Desirable, of course, means beneficial to the service provider first.
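To make the asymmetry concrete, here is a minimal sketch of such a banner, written as a React/TypeScript component purely for illustration. The component, its props, and the styling are all hypothetical rather than taken from any real site: the provider-preferred path gets one bright, prominent button, while the privacy-protective path is a dull, link-like button that only leads to further configuration screens.

```tsx
import React from "react";

type ConsentChoice = "accept-all" | "manage-settings";

// Hypothetical prop: a real banner would wire this to its own consent logic.
interface CookieBannerProps {
  onChoice: (choice: ConsentChoice) => void;
}

export function CookieBanner({ onChoice }: CookieBannerProps) {
  return (
    <div role="dialog" aria-label="Cookie consent">
      <p>We use cookies to improve your experience.</p>

      {/* The nudge: a large, friendly green button for the provider-preferred path. */}
      <button
        style={{
          background: "#2e7d32",
          color: "#ffffff",
          padding: "12px 24px",
          border: "none",
          fontWeight: "bold",
        }}
        onClick={() => onChoice("accept-all")}
      >
        Allow all cookies
      </button>

      {/* The alternative: a small, grey, link-like button that merely opens
          a further multi-step settings screen. */}
      <button
        style={{
          background: "transparent",
          color: "#888888",
          border: "none",
          textDecoration: "underline",
        }}
        onClick={() => onChoice("manage-settings")}
      >
        Manage cookie settings
      </button>
    </div>
  );
}
```

Technically both buttons do the same kind of thing; the nudge lies entirely in the visual weight given to each path and in how much work follows each click.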
The cookie consent button is just one of many examples of what are called dark patterns. The term goes back to 2010, when UX designer Harry Brignull set up a website called darkpatterns.org to collect and catalogue examples in use online. The 'hall of shame' section of the site is still posting some of the more egregious examples of dark patterns. A recent example, from July 13th, describes the process of cancelling a subscription to The Economist: the membership can only be cancelled via a live chat with a sales agent, who is naturally motivated to convince the user to keep it.
It is often the case that giant corporations such as Facebook, Google, and Microsoft set default settings that facilitate the maximum collection of user data. Privacy options that work to the user's advantage are usually buried somewhere in the settings menus. Individuals are often nudged into choices they would otherwise not make, choices that offer the optimum benefit to the provider, whether that is collecting data or getting the consumer to buy a specific subscription plan. If the default settings intentionally intrude on a user's privacy, or if misleading wording deliberately pushes an individual towards a sub-optimal decision, it is classified as a dark pattern. The ethical problems with this manipulative design architecture are clear, as companies have a vested interest in attracting as many users as possible and in collecting as much user data as they can.
In essence, some online platforms are engaging in a type of social engineering: finagling users into behaving in ways that are often contrary to their own interests, and into agreeing to things they normally would not if they understood the nature and consequences of those choices.
The 'only one room left at this price' message on hotel booking sites, or the list of friends who will miss you if you permanently delete your Facebook account, are examples of blatant emotional manipulation. The first pressures the user to book the room now; the second conveys the idea that deleting a Facebook account is a form of virtual suicide, in which we die in the eyes of our online friends. An often-quoted example of a classic dark pattern in action is the difference between signing up for an Amazon Prime account, which is designed to be very straightforward, and the considerably more complex process of cancelling the same account.
New regulations being added under the California Consumer Privacy Act would make the above-mentioned example illegal: cancelling an online subscription should be as simple as signing up for it. Fundamentally, the use of confusing or ambiguous language, or the making of processes unnecessarily complex, will come under the new regulations. Where a dark pattern is identified, the company will be given a period of 30 days to remedy it.
Where data subjects are exercising their data privacy rights online, or deciding what information is shared with an online platform, the use of misleading tactics or confusing language will contravene the amended regulations. The number of steps required of a user to permit the sharing of certain personal data with an online service will have to be the same as the number of steps required to deny permission.
Essentially, the consequence of this is that consent will not be recognised if gained through the use of dark patterns.
In the European Union, the General Data Protection Regulation does not explicitly mention dark patterns; however, Article 4(11) of the GDPR is clear on the matter where it defines consent as follows:
‘Consent of the data subject means any freely given, specific, informed, and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’.
Irrespective of the protections that can be benefited from in California, or of the cases over invalid consent mechanisms being taken in Europe, it is not clear how design architecture on websites will improve in the short term, especially where no regulations exist. It would not be hugely difficult to remove the most obvious examples of dark patterns, especially those relating to privacy consents or default pre-ticked boxes. Nor is it outside the realms of the imagination to envisage UX designers coming up with more innovative dark patterns to get around the new restrictions being implemented by law.