I’m a design researcher studying trust patterns in design. Building on research around trust and our own findings from user interviews and surveys, my team is studying communications and apps that engender or weaponize trust in technology. Trust is complex — it’s contextual, contractual, and cultural — and everyone has a slightly different definition. Understanding trust is key to creating ethical, well-designed technology that works for the common good. Read on to learn more about trust patterns in design, and if you’re a designer, take our survey and contribute to the research!
Let’s talk about trust
“What do you think of when you hear the word ‘trust’ and how do you define ‘trust’?”
I love to ask this question at dinner parties, in coffee shops, or as a general icebreaker, because the answers are revealing and fascinating. Trust seems to be more of a concept than a word, a bit like ethics: people seem to inherently know what it is but have trouble defining it. After all, trust is part of every relationship, and it’s something we’re taught over time, but it rarely has to be defined. In a way, trust can be a bit like breathing — we just do it, but, like a sudden coughing fit, we are keenly aware when it goes wrong.
I’m a design researcher who works at the intersection of policy and technology, specifically on online harassment and what happens when technology goes bad. Design is what makes the technology and policy inside an app, its infrastructure, or its social network understandable to a user. Design is incredibly important. As a design researcher, I firmly believe in designing for the common good. But designing for social good, the public good, and the common good, on top of making sure products practice harm reduction, is incredibly difficult.
As part of my research for a fellowship with Harvard University’s Kennedy School of Government, I’ve been looking at design patterns and behaviors around trust in social networks. Inspired by dark patterns research, I’m curious whether there are design elements that can engender or weaponize trust in technology. My research assistants Vandinika Shukla and Elyse Voegeli and I are conducting user interviews and surveys with journalists and technologists to study trust design patterns in social networks. Journalists are the one user group that faces every single form of online harassment, so which platforms they use, and the negotiations they make to stay on those platforms, are incredibly important to study. As designers, we need to understand the threats that groups like journalists face, and how design is complicit in both facilitating and mitigating those harms.
Our observations so far
How do we define trust? For the purposes of our research, we are using a definition of ‘trust’ from social psychologist Julian Rotter, who defines trust as “…an expectancy held by an individual or group that the word, promise, verbal or written statement of another individual or group can be relied upon…” In essence, trust is relational, and we are trying to test that in our survey. Do we trust our apps to do X? That is a relational experience: the app does what it says it will do and delivers adequately on that promise.
When I read this definition, trust in design immediately clicked. In a simple and straightforward way, it explains trust in all of my relationships, both offline and online, with people and with the technology itself. I trust my friends to care about me, and in the event that they hurt me, I trust our relationship to work it out. I have (past tense) trusted social networks to protect my passwords and emails from being leaked. I have trusted that my data would be used in the way that was articulated to me, and that if it was used by a third party, I could opt out (Cambridge Analytica, I’m looking at you). When my data is misused, or a friend and I fight and can’t resolve it, the trust is broken because the expectations of that contract were broken.
This is an interesting time in our society to start analyzing and unpacking trust. There’s a lot of disparate research out there, especially in this era of fake news and political unrest. According to a Pew Research report from May 2018, Americans trust tech but feel conflicted:
“74% of Americans say major technology companies and their products and services have had more of a positive than a negative impact on their own lives….When presented with several statements that might describe these firms, a 65% majority of Americans feel the statement ‘they often fail to anticipate how their products and services will impact society’ describes them well – while just 24% think these firms ‘do enough to protect the personal data of their users…’ Roughly half the public (51%) thinks they should be regulated more than they are now.”
For the most part, Americans seem conflicted but optimistic about the role of technology and social networks. This doesn’t quite define trust, but it does highlight the incredibly complex feelings users have about technology and about their relationship with it. Trust is hard.
So, what is trust? There isn’t one set definition but a variety. Georgia Bullen, executive director of Simply Secure, says, “Trust is rooted to culture, there is structure and norms. The idea of trust varies across cultures. It’s culturally normative.” Much like design, there isn’t a one-size-fits-all solution or definition when designing for trust. Eileen Wagner of Simply Secure points to the UI of security design, rating systems, and financial transactions, and how it interacts with the highly sensitive information demanded by a host of websites. For example, “there is always a checkbox when you now want to save credit card information with certain things, and what happens before and right after that is very important to whether or not I will tick that checkbox to save my credit card information on the website.”
In a handful of industry articles, consistency is often mentioned in designing for trust — consistent patterns, consistent design, and consistent asks. Transparency comes up too, as does simplicity of language: users need to know exactly what is going on and what is going to happen in the app or technology they are using. One way to think about ‘trust’ is this: are you asking for something in a simple and straightforward way, and are you delivering on it for the user? Let’s put this into practice: when asking for a user’s location or email, are you clear and upfront about why you need it and how it will be stored? When asking for anything from the user, are you leaving out any details? These disclosures aren’t just transparency for its own sake; they are pathways and frameworks for building a trust-based relationship with your user.
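To make that concrete, here is a minimal sketch of my own (not a pattern drawn from the research, and the helper name is hypothetical): the what, the why, and the storage terms of a location request are stated in plain language before the browser’s own permission prompt ever appears, and declining is treated as a first-class outcome rather than an error.

```typescript
// Hypothetical sketch: disclose why data is needed and how it is stored
// before asking the browser for it.

interface DataRequestDisclosure {
  whatWeAsk: string;      // the data being requested
  whyWeNeedIt: string;    // the purpose, in plain language
  howItIsStored: string;  // storage and retention, in plain language
}

function requestLocationWithDisclosure(
  disclosure: DataRequestDisclosure
): Promise<GeolocationPosition | null> {
  // Show the full disclosure before the browser's permission prompt appears.
  const message =
    `We are asking for: ${disclosure.whatWeAsk}\n` +
    `Why: ${disclosure.whyWeNeedIt}\n` +
    `Storage: ${disclosure.howItIsStored}\n\n` +
    `Share your location?`;

  if (!window.confirm(message)) {
    // Declining is a normal outcome, not an error.
    return Promise.resolve(null);
  }

  return new Promise((resolve, reject) => {
    navigator.geolocation.getCurrentPosition(resolve, reject);
  });
}

// Usage: every term of the "contract" is spelled out up front.
requestLocationWithDisclosure({
  whatWeAsk: "your approximate location",
  whyWeNeedIt: "to show events happening near you",
  howItIsStored: "kept on your device only; never sent to our servers",
}).then((position) => {
  if (position) {
    console.log("Nearby events for", position.coords.latitude, position.coords.longitude);
  }
});
```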
Within Silicon Valley and corporate tech, trust is a big theme. Even Airbnb has written articles and delivered TED talks on the subject. It’s easy to talk about trust at a high level, as we see in the Airbnb post, but very little goes into how Airbnb really designs for trust, even with a heavy focus on its reputation system. But what is trust within Airbnb, and how does the company create it? Customer service is service design; communication and UX design make it easier for a customer to contact or flag Airbnb; and service design, again, shapes how prompt the response is and what kind of response a customer gets. It is contractual (per our definition): consistency, reliability in design, and transparency in terms of governance and customer service.
This feels particularly aligned with designer Ashleigh Axios’s definition of trust, which she describes as “a combination of reliability, accountability, privacy and kind of integrity combined. So, what that means is when folks or companies say that they’ll do something, they follow through. They’re consistent in their words, and actions, and reactions…” This is similar to our definition: it’s someone delivering on the ‘contract’ of the agreement. So designing for trust isn’t, as Airbnb’s TED talk suggests, getting strangers to trust each other via the ‘good design’ and premise of the platform; it’s the contractual relationship with Airbnb the company that creates trust: the consistency of delivering on that agreement, the governance they’ve designed, hearing and seeing the successes of other customers and renters, and the ability to easily contact a customer service team that is actually helpful. Those elements are ‘trust’ in design.
Trust scaffolding
What are all of the elements that create trust? And how is trust weaponized? Those are the questions we are still answering. Does color play into trust? What about GIF support? UI pop-ups when connectivity is lost? Should platforms notify users of data breaches within the UI, or better explain end-to-end encryption? Should private Twitter or Facebook posts carry warnings about how ‘private’ those posts really are? Should Instagram posts allow users to opt out of their images being used for commercial purposes? All of these questions have design implications, and all of them can become features: features that relate to trust.
To create better products, we need to really focus on what transparency means in design. That doesn’t just mean open-sourcing code; it means explaining what a product is doing, when, and why, and offering opt-in rather than opt-out. Earlier I mentioned pathways and frameworks that reinforce trust: these are the specific things that create it. In essence, they are trust scaffolding for products, the additions that make products even stronger. Users need agency to make decisions within products, and they need clear, explainable options. Trust is not just telling users “we are using your data in X way”; it’s letting them have options. Trust needs to be clear, understandable, and consistent. If you display a ‘lock’ icon, or are storing data, that data needs to be properly stored and secured, because that is the contract and promise of displaying that lock icon. In designing for trust, security and privacy are incredibly important and cannot be glossed over; they need to carry equal weight with the design, and vice versa. Trust is about legibility and honesty.
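As a small, hypothetical sketch of what that scaffolding might look like in code (the types and names below are illustrative, not from any particular platform): every data use starts switched off, the user’s explicit choices are recorded so they can be reviewed and reversed, and any later use of the data has to check those choices rather than assume consent.

```typescript
// Hypothetical sketch of opt-in consent as "trust scaffolding".

type DataUse = "personalization" | "analytics" | "thirdPartySharing";

interface ConsentRecord {
  updatedAt: Date;
  choices: Record<DataUse, boolean>;
}

// Opt-in, not opt-out: every data use starts disabled until the user turns it on.
function defaultConsent(): ConsentRecord {
  return {
    updatedAt: new Date(),
    choices: {
      personalization: false,
      analytics: false,
      thirdPartySharing: false,
    },
  };
}

function updateConsent(
  record: ConsentRecord,
  use: DataUse,
  enabled: boolean
): ConsentRecord {
  // Record the change explicitly so the user can review and reverse it later.
  return {
    updatedAt: new Date(),
    choices: { ...record.choices, [use]: enabled },
  };
}

function mayUseDataFor(record: ConsentRecord, use: DataUse): boolean {
  // The "contract": data is only used for purposes the user switched on.
  return record.choices[use] === true;
}

// Usage: third-party sharing stays off unless the user explicitly enables it.
let consent = defaultConsent();
consent = updateConsent(consent, "personalization", true);

console.log(mayUseDataFor(consent, "personalization"));   // true
console.log(mayUseDataFor(consent, "thirdPartySharing")); // false
```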
If you’re interested in this project, whose outputs will include real design use cases and suggestions, please follow us on Twitter or read our TinyLetter. And please, designers, take our survey!