I just wanted to read a news story. But before I could, an ad popped up, obscuring the web page. The ad had a big green button labeled “Find Out More.” And in the upper right corner, there was an X. But clicking the X didn’t close the ad as I expected; instead, it took me to the advertiser’s site. Angry, I went back to the news page. The ad was still there. Finally, I found the button I needed – “No Thanks” in faint, tiny grey type.
That experience convinced me that I needed to go elsewhere for my news. If I can’t trust a site to do the right thing with their user interface, how can I trust their content?
I’m sure you’ve had similar experiences. They’re annoying, but also informative. They prove that how much you trust a site, an app, or any other experience isn’t just about lawyers and privacy policies. For most users, trust is viscerally about design. Does a site make it easy to see your choices, and make clear what the consequences of each choice will be? Or does it make some options (the ones that don’t benefit the company) hard to find, or try to trick you into clicking something you don’t really want?
In much of the world, in fact, that’s not just good design practice, it’s the law. The European Union’s General Data Protection Regulation (GDPR) requires that sites provide information about how they’ll use data in a “concise, transparent, intelligible and easily accessible form, using clear and plain language.” That’s a design challenge if I ever heard one.
These issues of trust and responsibility and how they affect and involve interaction design are the subject of the second annual World Interaction Design Day (IxDD) on Sept. 24, presented by Adobe and IxDA. I’ll be joining an event in New York City, but there are events happening all over the world. If you’re interested in interaction design, find an event near you and join the conversation.
One of the best trends of the past few years is the growing recognition of the importance of design. More and more companies are recognizing that design is a competitive advantage and putting designers at the center of product development. That gives designers more control and influence. But, as any Spider-Man fan will tell you, with great power comes great responsibility.
In this case, the designer’s responsibility is to stand up against the inevitable pressures to produce a design that helps the company but hurts the users – pressures to create the kind of hostile ad design I mentioned earlier, or to obscure the extent of the data the site is collecting and what it’s doing with that data, to give just a few examples.
Designers should use their new influence to advocate for the interests of users. We talk a lot now about design-centric development. Design-centric development should also always mean user-centric development.
Fighting for users’ interests can be lonely at times, but there are resources to help. If you can’t get to an IxDD live event, there is also a series of videos that addresses these topics. In the latest episode of the Wireframe podcast, my colleague Khoi Vinh convenes a roundtable of design experts to discuss privacy, trust, and responsibility. And at Adobe, we’ve worked with the educational nonprofit Simply Secure to produce a UI kit that helps designers build sites that protect users’ privacy and security.
If you’re a designer advocating for users’ interests, there’s a good chance you’ll run into someone at your company who’ll say some version of “I understand what you’re saying, but we need to bend the rules this once to make this initiative work/make our numbers for this quarter/make the boss happy/etc.” But those short-term gains aren’t worth the risk to your company’s reputation. There are too many examples to cite: when a company loses its customers’ trust, bad things follow. So when the going gets tough, remind yourself that looking out for your customers is the same as looking out for your company. And keep on fighting.