“Designed to be deleted” dating app Hinge is on a quest to get to know me on a level that other apps have not managed so far. I am a South-Asian woman with a dating preference for South-Asian men. Within a few days of actively using Hinge, the only profiles on my Discover page were those of South-Asian men, as if the app had become a matrimonial site for brown folks.
On the surface, this is the opposite of a problem. Personalization and machine learning have enhanced my user experience by showing me profiles of people who I’m more likely to match with. But these new technologies also raise a series of questions: What other snippets of my data are apps like Hinge analyzing? How are they analyzing it? And how does it show up in the way I interact with the app?
In response to some of the ethical dilemmas surrounding the UX of major dating apps today, a new wave of conscious, responsible alternatives is appearing. These apps take a new approach to the design of dating, whether rethinking how an app stores its data, reconsidering the addictive nature of platforms, setting out to keep minorities safe, or reimagining what a spontaneous interaction might look like in digital space.
800 pages of secrets
In 2017, journalist Judith Duportail asked Tinder to share the data they held on her, and the company responded with 800 pages of information. Some of the data points highlighted in the report seem wholly unnecessary to finding her a romantic match —“Facebook ‘likes’ [and] how many Facebook friends [she] had” were among them—but perhaps the most shocking were the reams of pages that revealed every conversation she’d had on the app. Bumble, which works on a similar swipe model to Tinder, goes one step further and records “mobile fingerprints,” such as swipe patterns and response time to messages, which help the app’s algorithm understand your behavior and further hone its suggestions to you. Hinge, which like Tinder is part of the umbrella dating conglomerate Match Group, is “building features to help learn from users’ offline experiences so the team can go deeper and test more theories.”
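To make concrete how little machinery this kind of behavioral tracking requires, here is a minimal, hypothetical sketch in Python of a swipe log that captures decision time alongside each choice. The class and field names are invented for illustration; they are not drawn from Tinder, Bumble, or Hinge.

```python
# A hypothetical sketch of "mobile fingerprint" logging: every swipe is
# recorded with what was shown, what the user did, and how long they hesitated.
import time
from dataclasses import dataclass, field

@dataclass
class SwipeEvent:
    profile_id: str
    direction: str            # "left" or "right"
    seconds_to_decide: float  # hesitation time becomes just another column

@dataclass
class SessionLog:
    events: list[SwipeEvent] = field(default_factory=list)
    _shown_at: float = 0.0

    def profile_shown(self) -> None:
        # Called when a profile appears on screen.
        self._shown_at = time.monotonic()

    def swiped(self, profile_id: str, direction: str) -> None:
        # Called when the user decides; records the choice and the delay.
        self.events.append(
            SwipeEvent(profile_id, direction, time.monotonic() - self._shown_at)
        )

# Usage: the log itself is trivial to build, which is precisely the point.
log = SessionLog()
log.profile_shown()
log.swiped("user_123", "right")
```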
It’s no surprise that free dating apps are gathering information from the moment users log on and begin swiping. Like other social media platforms, dating apps view information as currency, and it’s becoming increasingly clear just how common it is for the most-used dating apps to track user behavior and exploit it for marketing purposes. Receiving targeted advertising according to one’s relationship status has an eerie undertone, and data giant Facebook openly offers demographic targeting services to businesses based on users’ relationship status. Speaking to Consumer Reports, Jeff Greenfield, the co-founder of advertising attribution firm C3 Metrics, highlights how relationship status data can be used to plug weight-loss programs to singles. Things become more sinister still when we consider how long some of the apps retain that data; Muzmatch, a dating app for those of Muslim faith, keeps records for three years before permanent deletion.
The question really is: Why are these companies storing data at all? The 800 pages of Duportail’s most nuanced interactions with the Tinder app give an overly detailed snapshot of the kind of data points dating apps use to enhance user experience, but they simultaneously reveal the amount of data that is harvested for seemingly nothing but a rainy day.
Pickable, developed by Clementine Lalande, deliberately limits the amount of data it collects about its users, with privacy and security as cornerstones of the app’s design and development. The anonymity awarded to women on Pickable—who do not need to provide personal information such as a name, picture, or age—addresses major concerns about how users’ vulnerable and sensitive data is used, whilst also allowing them to have a positive and flirty experience with the app.
Do bots dream of intimacy?
The overarching narrative of romance is one of chance meeting. That narrative disappears in the realm of dating apps, where love and intimacy become codified. Dil Mil, a dating app made exclusively for the global South-Asian diaspora, allows you to pick from a pool of Personality Traits, which are then added as tags to your profile. You can tag yourself as “Family-Oriented,” or, if you’re feeling particularly brave, “Wife Material.” Tags you share with fellow users are then highlighted as you swipe through. As I scroll through hundreds of labels on Dil Mil, I’m struck by how overwhelmingly positive most of them are, a problem that can be attributed to dating apps in general. A highly curated version of the user is presented on these apps, when, one could argue, true intimacy comes from recognizing and embracing flaws, both in yourself and your partner.
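As a toy illustration of that tag mechanic: matching on labels is little more than a set intersection. The tag names below are examples, not Dil Mil’s actual data model.

```python
# Each profile is a set of self-selected labels; the overlap is what the
# app would highlight while you swipe.
my_tags = {"Family-Oriented", "Foodie", "Ambitious"}
their_tags = {"Ambitious", "Foodie", "Wife Material"}

shared = my_tags & their_tags   # the tags that get surfaced as "in common"
print(shared)                   # {'Ambitious', 'Foodie'}
```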
Most people who use dating apps are on multiple platforms, which underscores how online dating has become a numbers game; it’s easy to forget that there are real people on the other side. On Coffee Meets Bagel, an app promising “meaningful” and “authentic” connections, the UX seems quite the opposite: Users gather coffee beans as part of a points system, and beans are spent every time you like, pass, or want more information about a potential match. This functionality arbitrarily enforces a limit on how much effort you can put into finding said “authentic” connection, the irony being that you have to pay real money if you wish to continue trying past your allocated limit.
Users can ‘pause’ their swiping, as if it were possible to put human connection on hold; dating apps like Coffee Meets Bagel echo video games in their UX, projecting an alter ego into the dating-game setting that can be put on hold while the ‘real’ you recalibrates. The addictive dopamine effect that comes with finding a match explains why users have multiple apps: when you run out of your daily quota of swipes on one, simply open another.
Entrepreneur Mohil Sheth—who is launching his own AI and data-driven dating app, Onely, in June 2020—feels that despite subscribing to a number of the big players in the market, “None of [the apps] have really been able to solve the true spark of making new connections,” citing the age-old example of the spontaneity that comes with walking into a bar and catching somebody’s eye. Sheth’s app will sync with Spotify or Apple Music, using music choices and listening history as data points to build connection between users, with algorithms classifying the music into over one hundred different ‘mood types.’ “Most have experienced coming across profiles that [have] fake information,” says Sheth. “Data using music is really hard to fake. You wouldn’t listen to songs you don’t like.”
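Onely’s actual classification is not public, but a hedged sketch of how listening history could become a matching signal might look like the following: each user’s history is reduced to a distribution over invented ‘mood types’ and two users are compared with cosine similarity.

```python
# Illustrative only: the mood labels and scoring are assumptions, not Onely's method.
from collections import Counter

def mood_profile(track_moods: list[str]) -> Counter:
    """Count how often each 'mood type' appears in a user's listening history."""
    return Counter(track_moods)

def mood_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two users' mood distributions (0 to 1)."""
    shared = set(a) & set(b)
    dot = sum(a[m] * b[m] for m in shared)
    norm = (sum(v * v for v in a.values()) ** 0.5) * (sum(v * v for v in b.values()) ** 0.5)
    return dot / norm if norm else 0.0

alice = mood_profile(["melancholy", "melancholy", "upbeat", "romantic"])
bob = mood_profile(["upbeat", "romantic", "romantic", "energetic"])
print(round(mood_similarity(alice, bob), 2))  # 0.5 – a rough compatibility signal
```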
The opacity of algorithms
Dating apps use data to fine-tune who they surface as a potential match, but that convenience is a double-edged sword. While it means you might find a match faster and more easily, it can also lead to algorithms that perpetuate biases and limit the pool of potential partners. Dating apps are designed to learn preferences and feed you more of the same based on what’s worked in the past—this weighting towards past predilections becomes a lot more complicated when racial and ethnic preferences are thrown into the mix. In 2016, the app Coffee Meets Bagel received criticism for an algorithm that weighted matches towards similar race, despite users stating they had no racial preference.
And while apps like Tinder and Bumble claim they don’t collect data about users’ ethnicity or race or use similar information to inform their algorithms, the opaque nature of algorithmic matching can reinforce conscious or unconscious bias. “When the screening process is automated, users may be unable to determine precisely how their matches were selected, or why others were deemed incompatible and thus made invisible,” researchers from Cornell University wrote in a 2018 paper called “Debiasing Desire: Addressing Bias & Discrimination on Intimate Platforms.” “Users may assume that their stated preferences had some impact on their outcomes, but the use of aggregate user preferences to make match predictions can make the logic behind intimate matches difficult to understand.”
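To see how that opacity can smuggle bias back in, consider a deliberately simplified ranking function: when a user states no preference, candidates are ordered by aggregate like-rates learned from everyone else’s swipes, and the crowd’s skew quietly becomes the individual’s feed. All names and numbers below are invented for illustration.

```python
# Hypothetical platform-wide like-rates, skewed by past user behaviour.
aggregate_like_rate = {"group_a": 0.62, "group_b": 0.35}

def rank_candidates(candidates, stated_preference=None):
    """Rank candidates; with no stated preference, fall back to crowd behaviour."""
    def score(c):
        if stated_preference:                      # user explicitly asked for a group
            return 1.0 if c["group"] == stated_preference else 0.0
        return aggregate_like_rate[c["group"]]     # "no preference" inherits the crowd's bias
    return sorted(candidates, key=score, reverse=True)

candidates = [{"name": "P1", "group": "group_b"}, {"name": "P2", "group": "group_a"}]
# A user with no stated preference still sees group_a surfaced first.
print([c["name"] for c in rank_candidates(candidates)])  # ['P2', 'P1']
```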
For those identifying as LGBTQ+, some of the data presented on the apps as part of their profiles is sensitive. Providing geolocation data might pose a risk to those who have yet to come out—and yet, Grindr’s location functionality is accurate within a kilometre in certain locations. Most apps enforce a male/female gender binary, a two-pronged data point that allows for simple analysis, and conveniently ignores the gender spectrum in favor of the biological dichotomy. Thankfully, several new dating apps have recently appeared that promise to keep minorities safe.
Lex, a dating app for the LGBTQ+ community, is based on the original dating platform: personal ads in newspapers. The text-first approach offers a simple revolution in online dating: You don’t see pictures of potential matches first, allowing a connection to build on communication alone.
Similarly, Pickable and Blindlee allow women to control the amount of information revealed on their profiles. Pickable offers to hide certain personal details such as age, while Blindlee includes a functionality that blurs a female user’s face during short video calls with potential matches. The design not only empowers women who may otherwise be afraid of exploitation and abuse on dating apps, but also, like Lex, takes a text-first approach to online dating: users communicate in written form first, then have the choice to reveal personal details and photographs of themselves.
Finally, although it is rarely spoken about, straight, cisgender women are not the only victims of predatory behaviour online, something BARE Dating recognizes. BARE allows all users, regardless of gender identity, to hide and reveal elements of their profile, and requires users to upload a form of government-issued identification in order to minimize the risk of hacking and scamming.
The new rules for designing dating apps
Given all the complicated nuances around designing dating apps, how can designers move forward and protect users, limit discrimination, and deliver on the promise of finding intimate connections? Rena Bivens and Anna Shah Hoque, academics in the fields of Communications and Gender Studies respectively, suggest in their paper “Programming Sex, Gender, and Sexuality: Infrastructural Failures in ‘Feminist’ Dating App Bumble” that design inherently reflects the development team’s beliefs:
…technological design [is] a social and political act that is both influenced by surrounding sociocultural and political-economic contexts and actively involved in constructing such contexts.
In the context of dating apps, a predetermined idea of love existing only between a man and a woman ignores both same-sex relationships and the gender spectrum. The notion that intimacy can be robotized by matching interests and personality traits implies a lack of understanding when it comes to the complexity of human emotion. Holding, sharing, and analysing sensitive data points, particularly of minorities, suggests a fundamental lack of awareness of structural vulnerabilities.
Responsible design of dating apps could include educating design teams on unconscious bias, or ensuring that design teams themselves are suitably diverse in terms of gender, race, cultural background, and a variety of other factors. Responsible design could also educate users themselves, by adding alerts to ensure users are mindful about their data footprints. In practice, this could be as simple as a reminder to be digitally savvy and safe every time a user starts a chat with a new person, or introducing algorithms that track when an individual is at risk of providing sensitive personal information to a match.
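The second idea, an alert that fires when someone is about to share sensitive personal information, could start as something as simple as the sketch below; the patterns and wording are illustrative assumptions, not any existing app’s feature.

```python
# A minimal check that warns a user before a draft message containing
# obviously sensitive details (here just phone numbers and street addresses) is sent.
import re

SENSITIVE_PATTERNS = {
    "phone number": re.compile(r"\b(?:\+?\d[\s-]?){7,15}\b"),
    "street address": re.compile(r"\b\d{1,4}\s+\w+\s+(street|st|avenue|ave|road|rd)\b", re.I),
}

def sensitive_info_warning(message: str):
    """Return a gentle reminder if the draft message looks like it shares personal data."""
    found = [label for label, pattern in SENSITIVE_PATTERNS.items() if pattern.search(message)]
    if found:
        return (f"This message looks like it contains your {', '.join(found)}. "
                "Share it only if you feel safe doing so.")
    return None

print(sensitive_info_warning("Sure, call me on 07700 900123"))
```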
Responsible matching of users could include data points that help intimacy flourish, by asking users what they look for in potential partners, and what they believe their own personal flaws to be. Consider the message that your app and its functionality are communicating. For me, although apps like Pickable and Blindlee are taking positive steps in the right direction, a feature that allows women to hide themselves in order to feel safe, and then slowly unveil themselves to their male counterparts, does little to solve the problem of online sexual assault. It also carries a disturbing echo of sixteenth-century tales of sacred virginity.
As Luke Stark, a digital technology specialist at Dartmouth College, says, “We can’t feel data.” Its intangibility, combined with its behemoth scale, corners us puny humans into a position of inertia. But as users, we must become increasingly critical, and increasingly protective of our rights as data subjects and consumers. Because nestled in dating app Bumble’s Terms and Conditions lie the following words: “You agree that you will not file or participate in a class action against us.”