Illustration by Freddierick Mesias

Anyone who was a Mr. Robot fan will remember the scene in which Susan Jacobs, general counsel for the fictional company E-Corp, has a disturbing encounter when her smart home is hacked. The alarm rings, music blares at a deafening volume, the shower water turns scalding, then the air conditioning turns icy cold. It was a horrific depiction of how the Internet of Things (IoT) can be used to harass people in domestic settings.

Fictional as it is, the scenario also points to something very real: the increasing use of smart technologies in cases of domestic abuse and intimate partner violence. Technology-facilitated abuse, or “tech abuse” for short, has become a broader, more diffuse, and harder-to-counteract problem as smart technology has become ever more deeply entwined with daily life. There are expected to be more than 64 billion IoT devices worldwide by 2025, with the IoT home market set to reach a staggering $53.45 billion by 2022. Combine this with the prevalence of domestic abuse, which in England and Wales alone affects one in four women at some point in her life and kills two women every week, and it’s clear that urgent attention needs to be paid to how these phenomena relate to each other.

Domestic abuse has been a feminist issue for decades, with tireless activism and campaigning exposing the issue, providing shelter and support, and demanding radical change from law, medicine, and society. As tech abuse puts technology at the center of this ongoing fight, a feminist approach to design will lead to a better understanding of how technology can facilitate domestic abuse, how and when devices are likely to be used and abused, and the importance, in general, of designing with and for the most vulnerable.

Understanding the nuances of domestic abuse 

Tech abuse can be carried out through a range of smart technologies, including internet-connected cameras, security systems, thermostats, voice-activated assistants, and speakers. The “smartness” of these technologies lies in the ability to control them remotely through a connected device like a smartphone, which gives abusers the power to monitor and harass victims. Imagine waking up in the middle of the night to find yourself in a sweltering room, or discovering cameras installed throughout your home so that an intimate partner can surveil your conversations with friends and family in your own kitchen. These are real scenarios that have been reported to shelter workers and domestic abuse organizations.

Dr. Leonie Tanczer, principal investigator at UCL’s Gender and Internet of Things project, emphasized how important it is that designers have a solid understanding of the nuances of domestic abuse when determining the design and implementation of IoT devices. She explained that there are typically three stages to domestic abuse: when a person is in the relationship, and often cohabiting with an abusive partner; when they are preparing to contact the police or move to a refuge; and when they have left and are trying to rebuild their lives. “Across all these stages, technologies carry different risks but also opportunities to victims, stretching from means of punishment or control to something more empowering,” she says.

Combating the rise of stalkerware

Alongside hacking IoT devices, tech abuse can also be carried out through stalkerware or spyware—software that enables someone to monitor activities on another user’s device without their consent. Stalkerware infections grew by 40% in 2019, according to cybersecurity company Kaspersky, which found 67,500 unique users with stalkerware apps installed on their phones that year. In 2018, Ross Cairns was convicted of stalking his wife, Catherine, after he used one of these apps to log in to an iPad mounted on the kitchen wall of their home and eavesdrop on a phone conversation in which she talked about their failing marriage. Shortly after the conversation, Cairns arrived at the property and confronted her about what she had said.

Stalkerware apps are commercially available (you can download them through app stores in minutes) and can easily be installed to give an abuser access to everything on a victim’s phone. In her TED Talk, cybersecurity expert Eva Galperin notes that this kind of spying is remarkably cheap, costing around $40 per month, and that some vendors are brazen in their marketing strategies—claiming, for example, to help people “spy on their wives with ease.” In fact, many spyware companies are blatant about how their products will be used and how to install them. Take this description on Xnspy’s website, for example:

“If it’s Xnspy you are spying with, you will have to manually install it on the target device. Then you can spy on them, remotely, which is a completely different matter. Xnspy only takes under a minute to download on a target Android phone if you have a good internet connection. Add another minute or two for the installation. It’s normally quite easy to have access to someone’s phone for this much time. You will be done with the installation before you even know it.”
Legislation playing catch-up

As is often the case with rapidly evolving technologies, legislation and regulation struggle to keep up. The Domestic Abuse Bill was recently reintroduced to the UK Parliament and is now expected to include more protections for victims, including a requirement for local authorities to provide refuge accommodation, as well as acknowledgement of different facets of tech abuse. Dr. Tanczer has written that the lack of focus on tech abuse in the bill could have a severe impact on vulnerable groups in the future; in a recent statement, the government announced that the bill will encompass tech abuse, a promise she hopes it will follow up on when the bill is passed into law. In the US, legislation is a bit further along: Queens-based Assemblywoman Nily Rozic has introduced a bill that would allow judges to issue protective orders barring abusive partners from using internet-connected devices. The European Court of Human Rights has also recognized the improper interception, consultation, and saving of electronic communications as an aspect of violence against women and girls.

Yet alongside changing legislation, we also need multiple stakeholders to contribute to addressing tech abuse, including IoT vendors, support services, bystanders, activists, academics, governments, the law, and law enforcement.

Crucially, designers and developers of these products have a responsibility to fully understand how they impact the lived experiences of women facing domestic abuse. Otherwise, they risk unwittingly assisting perpetrators.

Feminist approaches to designing, building, and evaluating products or services are essential because they expose the ways technologies amplify existing structures of oppression and inequality, and they challenge that inequality throughout the entire design process. As Maria Farrell, an internet policy and governance consultant, has written, feminism is a secret superpower in cybersecurity: “Checking every app, data-set and shiny new use-case for how men will use it to endanger women and girls is a great way to expose novel flaws and vulnerabilities the designers almost certainly missed.”

Here are some prompts gathered from my research and conversations with experts in the field:

Design with (not ‘for’) victims and survivors

Feminist approaches encourage designing with, not for, communities. CHAYN, a global volunteer network addressing gender-based violence, is a brilliant example of this mentality: Up to 70% of its volunteers are survivors, who are involved in everything from organizational strategy, project selection, content creation, and translation to planning, testing, and making UX recommendations. Survivor-led design has two major advantages. First, it embeds diverse views, values, and experiences in each stage of a project, from conception to implementation. Second, it empowers survivors, giving them a sense of agency and value after having experienced a profound loss of control. In CHAYN founder Hera Hussain’s words: “It’s really important that we don’t just create things for the end product, but also the process of creation must be an empowering act itself.”

The homepage of domestic violence volunteer network CHAYN, an example of survivor-led design.

Pay close attention to survivors’ lived realities 

Another CHAYN-led project, Soul Medicine, emails bite-sized information to women on topics including domestic abuse and staying safe online. As the design team was survivor-led, they knew what would have helped them when they were experiencing abuse themselves. Engaging with these lived realities, they understood that email subject lines could alert abusers that their victims were trying to access help, putting them in even more danger. The design solution was to give the emails fake subject headers that would not tip off an abuser. Working with survivors, you can always ask: “What would have helped, when I was in that situation?” Without them, this is impossible to answer.

The homepage of Soul Medicine, a digital service designed to deliver critical information to women experiencing abuse, especially migrant and refugee women.

Understand the shared device ecosystem

Often, perpetrators own the account details and hold administrative rights to devices. They may have bought the victim’s phone for them, know or be able to guess their password, or have access to children’s phones. Designers need to understand this shared device ecosystem and design devices to be far more transparent about who has access to them. Notifications could periodically remind device owners who has access, so that any unauthorized users can be removed.

However, this is another feature that can be misused to harm a victim. For example, if Google Assistant prompted a victim to check who has access to Nest, they could see if an abuser had gained access without their permission and remove them. But a perpetrator could also use this feature to deny access to a victim. Designers, and in particular communication designers, can help educate users on the risks involved in sharing devices.
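For teams prototyping an access reminder like the one described above, the basic logic can be sketched in a few lines of code. This is a minimal, illustrative sketch only: the `Device` class, its methods, and the user names are hypothetical, not any vendor’s actual API.

```python
# Sketch of an access-audit reminder for a shared smart-home device.
# The goal: make it obvious to the owner who else can control the
# device, and make removing an unrecognized user a one-step action.
from dataclasses import dataclass, field


@dataclass
class Device:
    name: str
    owner: str
    authorized_users: set = field(default_factory=set)

    def access_summary(self) -> str:
        """Build the periodic reminder shown to the device owner."""
        others = sorted(self.authorized_users - {self.owner})
        if not others:
            return f"Only you ({self.owner}) can control '{self.name}'."
        return (f"'{self.name}' can also be controlled by: "
                + ", ".join(others)
                + ". Remove anyone you don't recognize.")

    def revoke(self, user: str) -> None:
        """Let the owner remove an unrecognized or unauthorized user."""
        self.authorized_users.discard(user)


# Example: an owner spots an unfamiliar device and revokes its access.
thermostat = Device("Living-room thermostat", owner="sam",
                    authorized_users={"sam", "unknown-tablet-02"})
print(thermostat.access_summary())   # names "unknown-tablet-02"
thermostat.revoke("unknown-tablet-02")
print(thermostat.access_summary())   # now only the owner has access
```

Even a sketch this simple surfaces the design tension discussed above: whoever receives the reminder holds the power, so the same transparency that protects a victim can be turned against them if the abuser is the account owner.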

Don’t build spyware 

Just don’t. As Hera put it: “Stalkerware shouldn’t exist at all, because it’s unethical.”

If you are worried about tech abuse, there are some great resources online:

Gender and IoT (G-IoT) Resource List: a list of organizations that produce guidelines and advice

eSafetyWomen: information to help manage online safety issues

Too Into You: a quiz to discover the signs of dating abuse

Computer Security And Privacy for Survivors of Intimate Partner Violence: security and privacy resources for managing tech abuse

How “stalkerware” apps are letting abusive partners spy on their victims: tips on what to do if you’re worried about stalkerware

The Good Friend Guide: tips for how to be supportive to a friend or family member experiencing abuse