When we envision the future, it always seems so efficient. Smart technologies will make our lives better and more productive. That is a rational view of technology, but humans are not entirely rational. We are also emotional beings who form emotional bonds with technology.

Our modern-day machines understand quite a lot about us, but unfortunately they don't understand how we feel. That's why we often feel frustrated when interacting with digital products: not necessarily because they fail to help us solve our problems, but because we believe they don't truly understand us. However, that's about to change.

Recent technological progress has made it possible to incorporate emotion recognition into our products. Many emotionally-intelligent experiences that have felt like science fiction for a long time can finally be programmed. It's already clear that artificial intelligence, paired with emotionally-intelligent technology, will shape the mobile experience of the future.

What is emotional intelligence?

Emotional intelligence is the ability to recognize, understand, and act on our own emotions and the emotions of others. It is a fundamental skill that allows us to understand each other and navigate our world. Understanding emotions applies to almost everything we do, from ordering a taxi to interacting with the people we love.

Why are emotions important for business?

We already know that design can have a deep emotional impact. It's clear why emotions are essential for users, but what about businesses? Do emotions have commercial value? The answer is yes! Everyone working in marketing or advertising already knows that emotion sells. Emotions create an opportunity to build a long-term relationship with users.

How to achieve emotional intelligence on mobile devices

Modern mobile devices are sophisticated computers that can take input from multiple sources, such as audio, the camera, or physical sensors, and use pattern recognition to detect emotion. An emotionally-intelligent mobile user experience can be split into three key areas (a rough sketch of how they fit together follows the list).

  • Natural language processing.
  • Facial expression recognition.
  • Analysis of physical signals.
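
At a high level, all three areas feed the same loop: capture a signal, infer an emotional state, and adapt the experience. The skeleton below is a hypothetical illustration of that loop in Python; every function name and threshold is a placeholder assumption, not a real mobile API.

```python
# Hypothetical skeleton of an emotion-aware mobile experience: capture signals,
# classify an emotional state, adapt the UI. Every function here is a placeholder;
# real apps would plug in platform APIs and trained models.
from typing import Optional

def read_text_input() -> Optional[str]:
    return "I'm so tired of this app crashing."  # stand-in for a chat message

def read_heart_rate() -> Optional[float]:
    return 95.0  # stand-in for a wearable reading

def classify_emotion(text: Optional[str], heart_rate: Optional[float]) -> str:
    # Placeholder fusion logic; real systems combine NLP, facial analysis,
    # and physiological signals with trained models.
    if text and "tired" in text.lower():
        return "frustrated"
    if heart_rate is not None and heart_rate > 100:
        return "stressed"
    return "neutral"

def adapt_experience(emotion: str) -> str:
    responses = {
        "frustrated": "Offer help and simplify the current screen.",
        "stressed": "Reduce notifications and suggest a break.",
        "neutral": "Keep the default experience.",
    }
    return responses[emotion]

print(adapt_experience(classify_emotion(read_text_input(), read_heart_rate())))
```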

1. Natural language processing

Since the early days of human-machine interaction, we have relied on intermediaries to communicate our intent to machines: first punch cards, then the command line, and now graphical user interfaces. But we have always wanted to be able to communicate with technology more naturally.

Recent progress in natural language processing has made it possible to create a new type of interface: conversational interfaces. Today, two major types of conversational interface are available on the market: text chatbots and voice user interfaces.

Chatbots — bake conversation and emotion together

Chatbot apps have existed since the first days of interactive computing. Joseph Weizenbaum created one of the first chatbots, known as ELIZA, in the mid-1960s; it used natural language processing to simulate conversation. For a long time, chatbots were fun to play with but had little practical value for users: they weren't capable of solving real user problems. Recent progress in natural language processing, however, has made it possible to use chatbots for everyday user tasks. A new generation of chatbots is driven by deep learning, a branch of machine learning based on artificial neural networks, which is used to recognize patterns in language.

One great example of a chatbot built using deep learning is Xiaoice, an advanced natural language chatbot developed by Microsoft and available on the Chinese market. The chatbot is currently used by 40 million people. It uses sentiment analysis and can adapt its phrasing and responses based on positive or negative cues from its human counterparts. It also remembers details from previous conversations with users and draws on that information later (e.g. to ask follow-up questions).
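
Xiaoice's internals aren't public, but the basic idea of adapting a reply to the sentiment of a message can be sketched with an off-the-shelf sentiment classifier. The snippet below is a minimal illustration using the Hugging Face transformers library; the reply templates and confidence threshold are assumptions, not anything Xiaoice actually does.

```python
# Minimal sketch: adapt a chatbot reply to the sentiment of the user's message.
# Uses the Hugging Face `transformers` sentiment pipeline; the reply templates
# and confidence threshold are illustrative assumptions.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default English model

def reply(message: str) -> str:
    result = sentiment(message)[0]  # e.g. {"label": "NEGATIVE", "score": 0.98}
    if result["label"] == "NEGATIVE" and result["score"] > 0.8:
        # Respond with empathy when the user sounds upset.
        return "That sounds really hard. Do you want to talk about it?"
    return "That's great to hear! Tell me more."

print(reply("I lost my job today and I feel terrible."))
```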

When users interact with Xiaoice, it communicates back with independent opinions and suggestions that were not pre-programmed. The ability to learn makes it more engaging and exciting. Image by Digitaltrends.

There’s one thing that makes this chatbot stand out from the crowd — people don’t think of it as a machine, but rather as a friend or therapist. They are willing to confide in the bot just as they do with human friends. People often turn to Xiaoice when they have a broken heart, have lost a job, or have been feeling down.

Xiaoice recognizes emotions from the text. The chatbot can respond with empathy and sensitivity. Gif by EJ Hassenfratz.

Voice analysis — understanding what a user is feeling

When we think of emotional interactions with technology, we often think of conversation, and the most natural form of conversation is the voice. Ever since Siri was released a few years ago, voice has quickly become a human-machine interface of choice. Voice allows users to interact with personal devices and apps more naturally: we don't need to learn specific commands; instead, we interact with the app or device in the way that is most natural to us.

Voice can also help app creators understand how a user feels: emotion-sensing technology can analyze a user's vocal intonations and use them to infer their current mood. A few apps already recognize emotion from voice. One of them is Moodies Emotions Analytics, created by Beyond Verbal, a company that specializes in extracting the meaning behind different tones of voice. The app can extract, decode, and measure “a full spectrum of human emotions” in real time just by listening to a person talk. This kind of analysis has real practical value: by better understanding people's moods, products can tailor interactions to them.
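
Beyond Verbal's technology is proprietary, but the first step of any voice-based emotion analysis, extracting acoustic features such as pitch and energy from a recording, can be sketched in a few lines. In the snippet below the thresholds and mood labels are made-up assumptions; a real system would feed these features into a trained model.

```python
# Minimal sketch: extract simple acoustic features from a voice recording and map
# them to a rough mood label with the librosa library. The thresholds and labels
# are illustrative assumptions, not Beyond Verbal's actual analysis.
import numpy as np
import librosa

def rough_mood(path: str) -> str:
    y, sr = librosa.load(path, sr=16000)
    # Fundamental frequency (pitch) per frame, limited to a typical speech range.
    f0 = librosa.yin(y, fmin=80, fmax=400, sr=sr)
    # Short-term loudness per frame.
    energy = librosa.feature.rms(y=y)[0]
    pitch_variation = float(np.nanstd(f0))
    mean_energy = float(np.mean(energy))
    # Crude heuristic: lively pitch and high energy reads as "excited";
    # flat pitch and low energy reads as "subdued".
    if pitch_variation > 40 and mean_energy > 0.05:
        return "excited"
    if pitch_variation < 15 and mean_energy < 0.02:
        return "subdued"
    return "neutral"

print(rough_mood("sample_voice.wav"))  # hypothetical audio file
```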

Screenshot of the Moodies Emotions Analytics mobile app, which registers a person's emotional state after they touch the screen.

Voice user interfaces — creating better engagement with voice

The voice we use in our products has a direct impact on how users perceive them. People are well aware that digital products don’t have feelings and yet they prefer responses that feel warm and human, rather than cold and robotic. Products that sound like a human are able to create a better connection with users. A recent study conducted by JWT Intelligence found that 37 percent of voice technology users “love their voice assistant so much that they wish it were a real person.” Almost one-third of those who participated in the study confessed to having fantasies about their voice interaction system.

This happens simply because, as human beings, we attach a persona to a voice as soon as we hear it. A voice is a natural part of a persona, and it shapes that persona's identity.

Science fiction stories, like the movie Her, where conversation and emotion are bound together, will soon be real.

2. Facial expression recognition

Facial expressions are getting a lot of attention in the world of technology right now. The reason is simple: facial expressions are some of the most powerful tools people have to express themselves. It's built into our DNA; we transmit and decode emotions using facial expressions. In fact, we can transmit more information with our expressions than with our voice.

As a technology, facial recognition is not new. It is already being used by social networks to help automatically detect your friends and family in photographs (e.g. DeepFace, created by Facebook). This technology can be beneficial if we want to create emotionally-intelligent design.
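
Face detection, the first step in any facial-expression pipeline, is available in common libraries today. The sketch below uses OpenCV's bundled Haar cascade to locate faces in a photo; the expression-classification step that systems such as DeepFace or FacioMetrics layer on top is only stubbed out here as an assumption.

```python
# Minimal sketch: locate faces in an image with OpenCV's bundled Haar cascade.
# Classifying the expression on each face needs a trained model; the
# classify_expression stub below is a placeholder assumption.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_pixels) -> str:
    # Placeholder: a real system would run a trained expression model here.
    return "unknown"

image = cv2.imread("photo.jpg")  # hypothetical input image
if image is None:
    raise FileNotFoundError("photo.jpg not found")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    expression = classify_expression(gray[y:y + h, x:x + w])
    print(f"Face at ({x}, {y}), size {w}x{h}: {expression}")
```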

A better way to express our own emotions

Communicating emotion is a very natural thing for people: we have a lot of feelings, and we want to share them with others. Increasingly, we also share our emotions using technology.

One of the reasons Facebook decided to extend the ‘Like’ option was to let users share their emotions explicitly. Today, Facebook allows us to express a range of emotional responses such as laughter, surprise, anger, or sadness. Facebook's reactions don't just give people a way to express more emotional range; they give designers (and marketers) emotional data, too.

Facebook's emoji reactions to posts.

But Facebook wants to give people even more ways to express emotions. A patent filed by Facebook with the United States Patent and Trademark Office reveals that the company may be trying to make users' faces the new emoji. Using facial recognition technology, Facebook wants to sort through users' tagged photos to find the faces that best match the emoji a user wants to use.

Facebook isn't alone in this trend. Snapchat and Apple now use facial recognition tools to allow users to express more emotions.

  • Snapchat successfully integrated facial recognition technology for sharing emotions.
Snapchat uses facial recognition technology to let you overlay animated effects on your face.
  • At its September 2017 keynote, Apple demonstrated the iPhone X. Among its many features, one was especially memorable for people: Animoji.
The new Animoji feature uses facial recognition to turn your face into an animated emoji.

Following this trend, many more apps will start to add an emotional layer to their core features.

Emotion recognition and personalization

We all know how deeply our online experience is personalized. Search engines, news sites, and social networks have become quite smart at giving us what we want. However, when modern personalization algorithms analyze us, they consider only a fraction of our personal information: our behavior. The information we search for, the photos we like, the videos we watch, and our comments and ratings shape our online experiences. So far, apps and sites haven't considered how we feel, and that's about to change.

The next level of personalization will rely on both behavior and emotions. Mobile apps will understand how you're feeling based on your facial expression and adapt accordingly. Here are a couple of examples of how emotionally-intelligent technology could offer recommendations based on our emotions (a rough sketch follows the list).

  • Rather than basing recommendations on previous purchases alone, e-commerce sites like Amazon could provide recommendations based on aggregated data from emotion-sensing technology, including users' individual emotional reactions to products.
  • Rather than providing the same content to all users, emotionally-intelligent news apps could observe users as they read articles and track their facial expressions to understand what they like and what they don't, and then deliver more personalized content. For example, if users aren't feeling engaged, the news app could provide more visual content, like images or videos, rather than text.
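
How such a feedback loop might work is still an open question, but the basic idea, detecting an emotional reaction and feeding it back into content selection, is easy to sketch. Everything below, including the function names and the engagement threshold, is a hypothetical illustration rather than any real product's logic.

```python
# Hypothetical sketch of emotion-aware content selection: if a reader's detected
# engagement drops, prefer more visual items. Names, values, and the threshold
# are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Article:
    title: str
    has_video: bool

def detect_engagement() -> float:
    """Placeholder for a facial-expression model returning engagement in [0, 1]."""
    return 0.3  # pretend the reader looks bored

def pick_next(articles: List[Article]) -> Article:
    if detect_engagement() < 0.5:
        # Reader seems disengaged: prefer visual content if any is available.
        visual = [a for a in articles if a.has_video]
        if visual:
            return visual[0]
    return articles[0]

feed = [Article("Long-form analysis", False), Article("Two-minute explainer video", True)]
print(pick_next(feed).title)  # prints the video item while engagement is low
```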

What’s great about facial recognition is that technology can detect fleeting expressions that humans may not consciously register or might miss otherwise. This means that mobile devices have the opportunity to understand human emotions better than real humans.

Face tracking and expression analysis in mobile apps. Image by Facebook (FacioMetrics).

Of course, there are questions about the privacy implications of emotion-sensing technology. Not all users will be happy to share their emotions. Designers will need to establish ethical design practices for this in the near future.

3. Analysis of physical signals

Our bodies can tell us a lot about our own emotions. Emotion-sensing technology can detect signals that help us better understand what exactly is going on inside of us.

Revealing our own emotions to ourselves

Tracking our physical and mental health has become more common, and many of us use wearable devices for that purpose. Back in 2015, research by the NPD Group estimated that one in ten people owned a fitness tracker. These tiny pieces of technology, worn close to our bodies, have the opportunity to become devices that help us better understand our emotions. They can track signals such as our heart rate or skin temperature, recognize patterns in those signals, and infer emotions.

Devices can be used not only to track emotions, they can also provide real-time coaching to help users achieve emotional well-being. For example, when a device understands that a person is stressed, it can provide information on how to cope.
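
Commercial devices keep their algorithms private, but a first approximation of stress detection and coaching can be sketched from the signals mentioned above. The baselines, thresholds, and the suggested coping action below are assumptions for illustration only.

```python
# Minimal sketch: flag likely stress from wearable readings and suggest a coping
# action. The baselines, thresholds, and suggestion are illustrative assumptions,
# not any specific device's algorithm.
RESTING_HEART_RATE = 65.0   # beats per minute, assumed personal baseline
BASELINE_SKIN_TEMP = 33.5   # degrees Celsius, assumed personal baseline

def looks_stressed(heart_rate: float, skin_temp: float) -> bool:
    # Crude heuristic: elevated heart rate plus a drop in peripheral skin
    # temperature, a common correlate of acute stress.
    return heart_rate > RESTING_HEART_RATE + 20 and skin_temp < BASELINE_SKIN_TEMP - 1.0

def coach(heart_rate: float, skin_temp: float) -> str:
    if looks_stressed(heart_rate, skin_temp):
        return "You seem stressed. Try a two-minute breathing exercise?"
    return "All good. Keep going!"

print(coach(heart_rate=92, skin_temp=32.1))  # -> suggests the breathing exercise
```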

There are a few devices on the market that are focused on tracking emotions. Feel wristband and Spire use sensors that read your pulse, blood pressure, and skin temperature to detect emotions.

Spire notifies you about sudden emotional changes and suggests the appropriate actions. Image by Spire.IO.

Developing recommendations based on our emotions

Technology can help us better understand the people we interact with. Pplkpr is an example of an emotionally-intelligent recommendation tool. The tool collects physiological data through a wristband to tell you which friends and colleagues are better for your mental health.

Pplkpr tracks your physical and emotional responses while you hang out with your friends, then analyzes the data to identify who stresses you out and who makes you sad, happy, or excited.
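
Pplkpr's analysis isn't public, but the core idea, aggregating moment-to-moment physiological readings by the person you were with, can be sketched simply. The record format and the 0-to-1 stress scale below are assumptions made for illustration.

```python
# Minimal sketch: aggregate per-moment stress readings by companion and rank
# contacts by average stress. The (person, stress) records and the 0-1 scale
# are illustrative assumptions, not Pplkpr's actual data model.
from collections import defaultdict

readings = [
    ("Alex", 0.8), ("Alex", 0.7),  # readings taken while with Alex
    ("Sam", 0.2), ("Sam", 0.3),    # readings taken while with Sam
]

def average_stress_by_person(records):
    grouped = defaultdict(list)
    for person, stress in records:
        grouped[person].append(stress)
    return {person: sum(values) / len(values) for person, values in grouped.items()}

ranked = sorted(average_stress_by_person(readings).items(), key=lambda item: item[1])
print("Most calming company:", ranked[0][0])     # -> Sam
print("Most stressful company:", ranked[-1][0])  # -> Alex
```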

What we need in order to design for emotions

Having more data about people’s emotions doesn’t automatically lead to a more human experience. Even the best algorithms and latest technologies can only go so far without a designer.

Emotionally-intelligent design brings a few challenges for designers, and perhaps the most significant is changing the way we think about design. Currently, a key goal for designers is to create an experience that allows users to achieve their goal with the least amount of effort. Design is about efficiency first.

But designing for emotions requires a different approach. Crafting products that collect data, map it to emotions, and then convey or evoke emotion in various ways means developing a greater overall sensitivity to emotion. The next generation of emotion-sensing technology will challenge designers to get in touch with emotions, both positive and negative. To create empathic user experiences, it will be important to know users better, including what they feel when they use products and why they feel that way, and to respect their emotions in design.

Want to learn more about emotional intelligence?

If you want to learn more about emotional intelligence, consider watching the following videos.

 

Conclusion

Whether we are designers, developers, or product managers, we are the people responsible for what the future will look like. Our goal should be to enable our products to understand users' well-being and allow people to interact with products on an emotional level. It's time for a new, empathetic kind of interaction.