Dark patterns are as old as the internet itself. For over 30 years, the web has served as a breeding ground for manipulative design—and we’ve been attempting to stop it for nearly as long.
In an impressive act of foresight, Rolf Molich and Jakob Nielsen began developing a set of usability heuristics for user interface design as early as 1990. Their work, later refined by Nielsen into the Nielsen Norman Group's 10 Usability Heuristics for User Interface Design, includes the very principle that dark patterns violate: user control and freedom.
Little did they know that over the next few decades, that principle, which ranked third on their list, would be debated on design blogs and conference stages in perpetuity. In 2010, user experience expert Harry Brignull would coin the moniker “dark patterns,” and by 2020 the concept would become so mainstream that Google’s latest search results redesign, featuring ads that were nearly indistinguishable from organic results, would be criticized across media outlets worldwide.
Yet despite all the attention they receive, dark patterns continue to proliferate. We regularly click buttons that say, “No, I don’t like healthy food” just to dismiss an annoying pop-up, and we navigate through the labyrinth of turns it takes to cancel an account that we never intended to create in the first place.
Why are we adapting instead of revolting?
The role of design
The natural conclusion of many of these discussions is to look to designers; because they operate at the intersection of user and business needs, they have the power to keep both sides in balance. Articles encourage designers to push back on OKRs and speak out against features that don’t align with their values, painting them as referees in this game of tug-of-war, instead of the rope.
“Ten years ago, I was all about designers regulating themselves, but it’s clear that we’re all struggling to keep up,” Brignull says.
Dark patterns have gotten worse, not better. We need to ask why.
Harry Brignull, user experience expert
Kat Zhou, the founder of Design Ethically, identified a similar need during her time at IBM: “I created Design Ethically because we kept talking about how the tech industry needed to be more ethical, and I kept asking, ‘Okay, but what does that really mean? We can give it lip service, but how do we put that into practice?’” At first, her answer took the form of workshops and activities that reshaped the design process. The attention and resources devoted to fighting dark patterns have increased dramatically over the past decade, as initiatives like Zhou’s show, but “There is simply too much emerging technology for designers to keep up with on their own,” she says.
If you’ve ever watched an episode of Black Mirror, you’ve witnessed this in action (albeit in extreme form). “Facial recognition and augmented reality have brought technology beyond the screen,” Brignull points out, and in doing so they have turned our own faces into a platform for the next generation of dark patterns.
Similarly, artificial intelligence is difficult to control at the design level. If we don’t know what data the system is taking in or how the algorithm is sifting through it, it’s essentially a black box. “That’s exactly how companies get away with creating biases,” Zhou says.
Taking it to the top
Unsurprisingly, discussions about dark patterns nearly always take a dark turn, but in reality we’re beginning to fight them in a way that is more proportionate to their growing threat.
“As individuals, especially in large organizations, we see a lot of bureaucracy. A lot of red tape. Decisions come from the executive level, making it hard to speak out. But at the same time, there’s no guarantee that change will come from the top,” Zhou says. Instead, she advocates for a multi-pronged approach that begins at the grassroots level—finding allies, starting petitions—and goes all the way to Capitol Hill.
“Within your company, you may be able to zoom out and find resilience in numbers,” she says. If you’re concerned about a misleading interaction, chances are your product manager, engineers, researchers, support agents, or marketing counterparts are, too. Sometimes, expressing your collective stance is enough of a red flag to topple the strategy. Other times, it may take a petition to push the issue up the chain of command. But at a certain point, change has to come not only from within companies, but also from across the industry. “When you open a bank account, that’s regulated. When you buy meat from the supermarket, that’s regulated. It’s simply necessary in industries that are prone to abuse, especially ones that hold as much power as technology does,” Brignull says.
We’re already beginning to see movement in this direction. Most people are now familiar with GDPR, the General Data Protection Regulation that sparked worldwide updates to companies’ privacy policies and home pages. More legislation has followed in its wake. Last year, a bipartisan bill known as the Deceptive Experiences To Online Users Reduction Act (DETOUR, for short) was introduced in the U.S. Senate. The bill, which takes aim at deceptive interface design and unregulated A/B testing, among other practices, was widely celebrated by ethical design advocates as a step forward.
Now that many tech giants generate more revenue than the economies of entire countries, it stands to reason that governing bodies may be their best match. With those checks and balances in place, designers can refocus on the Herculean task of designing interstitials that ask for consent in a user-friendly way, instead of debating whether the company should be collecting cookies in the first place.
The road ahead
Regulation, of course, has a long way to go. In the United States, many elected officials still don’t seem to grasp how companies like Facebook make money in the first place—and when protective legislation does make its way into an interface, the required language is riddled with legalese, and the surrounding interstitial is comically disruptive. “It’s promising to see that both parties are making an effort,” Zhou notes. “But as we all know, governments are not perfect.”
While the wagon of bureaucracy catches up with tech’s move-fast mentality, Brignull says, “Now is the time to get ahead of regulations.” He recommends that teams implement their own principle-based practices while they have the time to design them thoughtfully, instead of waiting until they’re up against legal deadlines and hefty fines. “Laws tend to look like the complete opposite of a good design process, but they can always be interpreted differently. There’s nothing stopping us from adhering to regulations in a future-facing way.”
For example, once Brignull saw that the as-yet-unpassed DETOUR Act proposed in-house user advocacy groups, his company set up one of its own. The committee includes representatives from both inside and outside the design organization who sign off on proposed A/B tests as well as new features and products. This way, no matter the results of a test, the outcome will serve users’ best interests.
Of course, as much as we try to protect them, users play an unmistakable part in this fight, too. Perhaps an unintended benefit of having countless social platforms at our fingertips is that it’s easier than ever to call out (and in some cases, cancel) companies with unethical design practices. Digital literacy has quickly become essential to survival—or sanity, at least.
“We now base our reality on the information that companies like Google and Facebook decide to surface,” Zhou says. “They’ve become, in some sense, the arbiters of truth. So when the difference between an organic search result and a paid ad is too subtle, reality is determined by the highest bidder. It’s a big deal, and it plays a big part in shaping what people think about the world,” whether you’re deciding how to answer a homework question or who to vote for in the next election.
The stakes are high, and the scope is growing. As we choose our battles, designers may know better than anyone how difficult it will be to make a dent in dark patterns: how nuanced these interactions are, how good we’ve gotten at refining them, and how important it is to rein them in. By expanding our grassroots efforts all the way to the top, and leaning into the power of not only individuals, but also entire companies, industries, and legislative bodies, we may just be able to wrest back control after all.