Technology is evolving at a dizzying speed. The World Wide Web is just 30 years old, but it has changed massively over the last three decades. Now we all carry powerful computers in our pockets, and the way we design has transformed, too. We started by designing standalone websites, then we designed for the desktop first and created separate experiences for mobile. Responsive design put an end to that.

New tech is reshaping the way we live and work, and companies need to keep up with the latest developments to stay relevant and continue to engage their customers. In the 1990s, massive brands such as photography pioneer Kodak struggled with the shift to digital because they simply didn’t anticipate the enormous effect the new technology would have.

Don’t be Kodak: invest in learning about emerging technologies now. The form factors you design for today, like the smartphone, may look very different in a few years. To stay ahead of the game, understand how emerging tech lets us interact in new ways and how UX design needs to adapt to accommodate it. If you don’t, the market may simply make you obsolete.

In this article, we’ll take a look at four of the most significant types of emerging tech UX designers need to know about right now. Before we do so, however, let’s step back in time and explore how design has evolved. Because in order to understand the future, we must look at the past.

A brief history of UX design

Design has always been an integral part of product development, whether it’s cars, homes, or computers. As technology evolved, brands began to try and make the design of their products fit the times — or they’d risk being left behind. But it’s the invention of the internet that has really sped up the evolution of technology. It’s behind the majority of emerging tech we see today and the reason we are now at the cusp of another big change in how we relate to technology.

The introduction of the iPhone (and all the smartphones and tablets that followed) was a key moment in design’s role in technology. Suddenly, designers needed to take into account all kinds of screen sizes and use cases, as mobile devices became the primary computers for many people. It also required designers to think about a completely new type of interaction — touch. And it meant that there was more data than ever before that could be analyzed.

The internet became the biggest disruptor, but as companies adapted to emerging technology, they have increasingly done so with a focus on user-centered design. The term was coined by usability legend Don Norman in the 1980s, and his book “The Design of Everyday Things” remains hugely influential to UX designers today. While the context is ever-shifting in the face of new technologies, the goal remains the same: to deliver seamless and enjoyable user experiences. Always look at emerging technologies through the eyes of the user and focus on solving their problems first.

These days companies hire designers en masse — turns out they have the skill to translate complex new technology into essential interactions. Designers help create the best possible user experience, reduce friction, and give brands a competitive advantage. As tech evolves, however, so does the designer’s job. In the following section, we’ll examine some of the most exciting technologies right now and the key considerations UX designers need to make to get started.

Types of emerging tech: The next frontier

Voice UI

The next natural step in user interface design is the disappearance of the screen (there’s even a book called “The Best Interface is No Interface”). And so voice assistants such as Amazon Alexa, Google Home, Apple’s Siri and Microsoft’s Cortana have become hugely popular over the last couple of years. There are already around 3.25 billion digital voice assistants being used in devices around the world, and forecasts predict that by 2023 this will reach eight billion — a number higher than the world’s population!

Voice UIs have the potential to make the user experience more enjoyable and simpler. They enable people to multitask and form a more human relationship with technology. As there’s usually no screen, though, the process differs significantly from designing graphical UIs, and we need to understand voice communication and how to design a conversation in order to create effective voice UIs.

Let’s take a brief look at the first steps:

Research

Just as in other types of product design, a successful voice UI starts with plenty of user research to really understand the users’ needs, behaviors, and struggles. Identify your target audience and analyze how they currently perform tasks and find the information they need. Uncover what they might struggle with and where a voice experience might come in handy. Pay special attention in this step to the users’ language: the terms and phrases they use. You’ll need this later on when you design the conversation.

Define

Next, define your product and its capabilities. In this step, explore the scenarios in which someone might want to use your voice UI. Write down different use cases and order them by their importance and value to your target user. Then make sure your key scenarios will actually work with voice. Remember, the real reason to use voice is that it lets the user complete the task more easily and effectively. If that’s not the case, another interaction may be more suitable.

Create

Finally, create storyboards of potential situations that may require a voice UI. Then write dialogues, the foundation of the user flows, and keep them natural and brief. Reflect your unique brand and identity in the dialogue, but craft the tone of voice and personality of your voice assistant carefully. Always consider your users’ needs and think about how your best staff member would ideally interact with your customers, using empathy wherever you can.
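The dialogue step above can be thought of as a mapping from recognized user intents to responses and follow-up prompts. Here’s a minimal sketch in Python; the intents, phrases, and structure are purely illustrative assumptions, not any vendor’s API:

```python
# Toy dialogue flow: each recognized intent maps to a spoken response
# and an optional follow-up question. All names here are hypothetical.

DIALOGUE = {
    "order_status": {
        "response": "Your order shipped yesterday and arrives tomorrow.",
        "follow_up": "Would you like tracking details?",
    },
    "store_hours": {
        "response": "We're open from 9 a.m. to 6 p.m., Monday to Saturday.",
        "follow_up": None,
    },
}

FALLBACK = "Sorry, I didn't catch that. Could you rephrase?"

def reply(intent: str) -> str:
    """Return the spoken reply for a recognized intent, or a fallback."""
    turn = DIALOGUE.get(intent)
    if turn is None:
        return FALLBACK  # unrecognized intent: keep the conversation going
    parts = [turn["response"]]
    if turn["follow_up"]:
        parts.append(turn["follow_up"])
    return " ".join(parts)

print(reply("store_hours"))
print(reply("make_me_coffee"))  # falls back gracefully
```

Note how even this tiny sketch forces the two design decisions the article describes: what the assistant says in each scenario, and how it recovers when it doesn’t understand.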

Once you’ve designed your voice UI, test and prototype it and continue to refine it. There’s no denying that voice UIs — when done right — can give you a big advantage over the competition.

Amazon Alexa integration for Adobe XD lets you preview voice prototypes on Alexa-enabled devices.

Touchless gesture control

Touch, introduced by the iPhone in 2007, revolutionized the way we design, but now interaction design is evolving even further. Companies are starting to invest in touchless gesture control, which works by tracking the motion of a person’s hand.

Soon you won’t even need to touch your phone; you’ll be able to just wave your hand or pinch your fingers in the air to carry out a task. While Samsung requires you to hold a tiny sensor-equipped wand that doubles as a stylus, Google’s Project Soli uses radar technology built into a tiny chip crammed full of sensors. These interpret the motion and translate it into data, determining what the user is trying to do.
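That last step, turning tracked motion into an inferred action, can be illustrated with a toy classifier. This is a deliberately naive sketch (real systems like Soli use machine-learned models on radar data); the thresholds and gesture names are made-up assumptions:

```python
# Toy gesture interpretation: turn a tracked hand path, a sequence of
# (x, y) positions over time, into a named gesture.
# Thresholds and gesture names are illustrative assumptions only.

def classify_gesture(path):
    """Classify a hand path as a tap or a directional swipe."""
    if len(path) < 2:
        return "none"
    dx = path[-1][0] - path[0][0]  # net horizontal movement
    dy = path[-1][1] - path[0][1]  # net vertical movement
    if abs(dx) < 10 and abs(dy) < 10:
        return "tap"  # barely any movement
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

# e.g. skip a song with a rightward wave
print(classify_gesture([(0, 0), (20, 1), (40, 3)]))
```

Even at this toy scale, the ergonomic questions surface immediately: how far must a hand travel before a wave counts as a swipe, and what does a “natural” motion look like for your users?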

For now, you’ll only be able to snooze alarms, skip songs, and silence phone calls with gestures on Google’s flagship phone, the Pixel 4 (see video below), but it may pave the way for a touchless future. Other use cases include gaming and driving, as evidenced by BMW’s AirTouch dashboard (debuted at CES 2016), which uses sensors to pick up hand gestures, enabling drivers to adjust the volume or accept phone calls.

To design effective gesture-based interfaces, designers will need to study human ergonomics and motion, for example, how a user would naturally move their hand.

Artificial Intelligence (AI)

AI technology not only powers voice UIs, it’s also in the many chatbots we’ve seen pop up in recent years. Progress in natural language processing especially — which deals with interpreting what a user says and understanding what they want to do — has led to the creation of much more user-friendly experiences. Conversations with machines are becoming more natural, and we increasingly leave it to them to predict our next move. This is where anticipatory design comes in. It has the power to transform experiences by making smart suggestions and decisions instead of presenting us with a range of choices.

There have been some incredible advances in artificial intelligence and machine learning, and AI is increasingly making its way into mainstream product design. Facial recognition software, for example, is getting more mature and soon the technology could even extend to emotions. We can also make use of more contextual data than ever before.

As AI and machine learning become more powerful, we also need to keep their limitations in mind and consider ethics. A lot of algorithms are being fed biased data, which has resulted in flawed and discriminatory experiences (for example, facial recognition software that only works reliably for white faces, or the overreliance on female voices in virtual assistants). No matter what technology you’re exploring, always ensure harm reduction is part of your process.

Last year, Adobe previewed an AI feature called About Face that detects when a photo has been manipulated.

Virtual reality 

Another emerging tech with the potential to change our world is virtual reality (VR), which is already being used in gaming, education, travel, workplace productivity, and healthcare. It also opens up opportunities for innovative storytelling and e-commerce, letting customers try out products before they see and touch them in real life.

VR is transforming the way designers work, as they no longer need to create life-size models of their designs but can use VR technology to preview and prototype them before taking them further. It also helps designers put themselves in the shoes of the user more than ever before. Google Cardboard app “A Walk Through Dementia”, for example, enables you to perceive everyday life through the lens of somebody with dementia, which can significantly improve designers’ empathy and help build more inclusive products.

As VR completely immerses the user into a new environment and usually requires headsets such as the Oculus Rift, it’s important to remember that this technology has a very different set of requirements than UX designers may be used to.

To get started, consider the users’ safety, ensure they remain comfortable during the experience, and prevent motion sickness. Take into account diverse user ergonomics and capabilities and don’t inadvertently exclude people. For more tips, see Virtual Reality (VR) Design & User Experience.

In this talk, web animation expert and design evangelist at Adobe, Val Head, explores how the role of designers evolves as we move away from screens towards technologies like VR and AI.

What does this mean for the future?

Whether you’re exploring voice, chatbots, AI and machine learning, VR/AR, gesture recognition, or a completely new type of emerging tech, you will need to adapt. New technologies pose both an opportunity and a challenge. There are many unknowns, and best practices, standards, and patterns will still need to be figured out.

The good news is that design always plays an important part in creating engaging and intuitive experiences; only the technology changes. It impacts your work, but as a UX designer you can fall back on your existing skill set. Employ user-centered design techniques to conduct extensive user research, add delight and personality to an experience, design with inclusivity in mind, and build harm reduction into the process right from the start.

There are fresh challenges ahead, but you are in the unique position to step out of your comfort zone and play around with emerging tech early on to push boundaries. Just experiment, learn from your mistakes, and share what you’ve discovered with the community. Maybe you’ll even contribute to a fundamental shift in the way we interact with technology — how exciting is that?