Illustration by Tracy Dai
I’m sure you’ve been here before: you get an email, a newsletter you don’t remember signing up for, about a product you don’t remember asking for updates on. You think, “Hmm, well, I’ll just unsubscribe.” You scroll to the bottom of the email, find the unsubscribe button, and get sent to a website where you click through a few pages asking, “Are you sure you want to unsubscribe?… Are you sure?… Are you really, really sure?” Since you’re really, really sure, you click a button and boom: done. Unsubscribed. But the next week, the newsletter reappears in your inbox. Didn’t you unsubscribe? What happened?
You probably encountered a dark pattern.
A dark pattern is a digital design pattern that, intentionally or unintentionally, tricks a user into making a particular decision or choice. Sometimes a dark pattern is an obfuscation of information – for example, clicking an unsubscribe button only to be sent to a new page to click unsubscribe again, in slightly different language. It can be a security feature on a social network buried under multiple layers of settings, making it neither easy to find nor immediately surfaced. But why talk about dark patterns now? Over the past few years there’s been a resurgence of interest in them, from lawmakers trying to pass bills outlawing dark patterns, to academic researchers surveying over 11,000 e-commerce websites for them, to investigative journalists at ProPublica. As designers, we know that design can radically affect society – from accessibility in products (and legislation around it, like the ADA), to new forms of technology and infrastructure, to the fact that design literally touches every facet of our lives. But policy makers don’t necessarily consider how design shapes technology or directly impacts society.
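For readers who think in code, the unsubscribe ordeal above boils down to a simple effort asymmetry. This is a minimal, hypothetical sketch – the function names are invented for illustration, not taken from any real product:

```python
# Hypothetical sketch of an obstruction-style dark pattern: opting in takes
# one click, while opting out is buried behind repeated "Are you sure?"
# confirmation screens. Function names are illustrative only.

def clicks_to_subscribe() -> int:
    """A single click opts the user in."""
    return 1

def clicks_to_unsubscribe(confirmation_screens: int = 3) -> int:
    """Each confirmation screen adds a click before the final
    unsubscribe button is ever reached."""
    return confirmation_screens + 1

# The asymmetry is the pattern: leaving costs several times the effort of joining.
print(clicks_to_subscribe())    # 1
print(clicks_to_unsubscribe())  # 4
```

The point of the sketch is that the friction is a design decision, not an accident of the medium: nothing stops unsubscribing from being a single click, exactly like subscribing.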
In the policy world, there’s an understanding that technology can have negative influences on society, but the role design plays in those technologies often goes unrecognized. And this is what brings us to dark patterns, because a dark pattern can trick users in the context of their use of that technology. As a Princeton study found, in e-commerce settings dark patterns can hide costs or nudge users toward higher-priced items. In another example, ProPublica found that a dark pattern hid the free version of TurboTax from applicants who qualified for it.
Dark patterns out in the wild
Dark patterns, as manipulative, accidental, and strange as they can be, actually serve as a great metaphor for policymakers to begin to understand the impact design can have. Dark patterns like those described above are obvious even to non-designers; it’s easy to understand the harm they cause. This focus is something I call “design policy”: the understanding of how design affects policy, and that design can be a tool to explain what technology does.
From 2019 to 2020, I collaborated with Stiftung Neue Verantwortung, a policy think tank based in Berlin, to research the policy implications of dark patterns and to write a paper on how to explain to policy makers the importance of design in policy, especially around dark patterns. In the European Union, dark patterns are seen as specific kinds of threats: they create friction and hide choices, affecting users’ agency and their right to privacy. So, for the first time, lawmakers are realizing how important design is. One thing for non-European designers to keep in mind is that Europe has passed many regulations to protect consumer privacy, and generally has more stringent laws on users’ rights and their data than the U.S. and much of the rest of the world. This means Europe may be where we see the first successful regulation of something like dark patterns, given its history of protecting consumer privacy in technology. When it comes to limiting the harms of technology, Europe is currently leading the charge to protect users’ data.
For lawmakers, dark patterns provide names and tangible explanations for things they’ve noticed in technology but had no grounding for understanding or dissecting. Our paper argues that dark patterns erode personal privacy in digital products by obfuscating or hiding privacy filters and settings; one could argue that specific design choices make it more difficult to protect personal data. In addition, especially in Europe, dark patterns could systematically weaken the European Union’s privacy regulations, because they undermine the principle of individual consent by hiding these privacy choices – accidentally, unintentionally, or through poor design. Facebook’s deeply nested privacy settings are a case in point.
Another example is the “Netzwerkdurchsetzungsgesetz” (NetzDG), a German law passed in 2017 that required social networks to update their reporting flows for new forms of unlawful content, like online hate speech. However, the design of these reporting flows can make it difficult for users to find the reporting interstitial and then report offensive content. Our paper argues this considerably reduces the regulatory effect of the NetzDG, and that it is a dark pattern regardless of intention. Did Facebook want NetzDG reporting to be inefficient? Most likely not, but with a dark pattern one could argue intention doesn’t matter, since the effect is the same: hiding content, creating confusion, or nudging users toward a specific choice.
The future of design and legislation
Lastly, dark patterns could potentially violate competition laws that exist in Europe. Could a dark pattern give one digital store an advantage over competitors through manipulative or misleading design tactics? This is an open question that needs to be explored, especially by policy makers. It means policy makers need deeper design expertise and need to collaborate with researchers who understand product and interface design.
Not since the passage of the ADA has there been such an intersection of design and legislation as there is now with dark patterns.
What’s important to realize is that, as designers, our work has a massive impact – it can harm, just as much as technology itself does.
It’s not just technology itself that creates harm, as in the case of flawed facial recognition, biased AI, or myriad other issues. Design works with technology to make it understandable to all users. Tech regulation already exists in Europe and may be coming to the United States, and I don’t think design regulation is far behind. That means the impact of what we do as designers has never been greater, and our accountability is about to get even more serious. Responsible and ethical technology movements already exist, with their own standards and suggestions; the study of dark patterns, and legislators’ growing awareness of them, could create a similar movement around design.