Creating intuitive and engaging experiences for users is a critical goal for companies large and small. To achieve this, modern product design is driven by a cycle of ideation, prototyping, and user testing. Product teams are always looking for ways to improve this cycle, and as a result, the design space has witnessed a tremendous transformation over the last few years. The rise of design tools for wireframing and prototyping has made it easier than ever for designers to validate hypotheses.

But despite these evolutions, even the most advanced prototyping tool still requires a significant amount of time and effort from product teams. Designers are always looking for tools and technologies that will help them work more efficiently. Enter: artificial intelligence (AI).

The next generation of AI systems promises to streamline the wireframing process and is designed to help designers turn ideas from the drawing board into actual products—almost instantaneously. Let’s explore how they will do it.

Today’s design workflow

First, before we look at how AI is changing wireframing, we need to understand the current design workflow: the series of steps that designers and developers follow to create functional products. Most teams follow some version of these steps:

1 – Research and analysis. The product team conducts user research and captures product requirements in the form of a specification.

2 – Brainstorming and ideation. Designers analyze product requirements, conduct brainstorming sessions, sketch out ideas, and discuss them with team members and stakeholders. They turn some of these ideas into low-fidelity wireframes.

3 – Re-creating layouts from sketch to digital. Depending on the phase of the product design process and available resources, it might take minutes, hours, or even days to turn the low-fidelity design artifacts into digital mocks.

4 – Prototyping from digital layouts. Though many application design tools have built-in features that allow mocks to be turned into prototypes, designers have to invest time in organizing the content on each screen accurately.

5 – Design validation. Users test the design to reveal usability flaws, and designers take that information to introduce changes in the proposed design.

6 – Coding the design. The engineering team steps in and turns the prototype into a real product.

As you can see, the workflow is a multi-step process. It typically takes several weeks and cross-team collaboration to turn an idea into a validated concept.

Teaching machines to understand user interfaces

As seen in today’s design workflow, there are two steps between the ideation of a product and testing it with users: re-creating layouts from sketch to digital and prototyping from digital layouts. Designers are looking for ways to automate those steps. 

Some steps in the design workflow are redundant, requiring time designers could use to focus more on creative tasks. Image credit Tony Beltramelli.

When it comes to using automation to streamline product design, it’s natural to start with wireframing because it is an intuitive method of expressing a concept. Currently, designers spend a significant amount of time turning wireframe sketches into design layouts. But what if a machine could do it automatically? Using computer vision and machine learning algorithms, wireframing AI can teach machines to understand sketches and turn them into user interfaces automatically. With this technology, it’s possible to skip a few steps in the product development lifecycle and instantly translate sketches into a finished design.
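As a rough illustration of the idea (not any vendor’s actual pipeline), the recognition step can be sketched in a few lines, with simple heuristics standing in for the trained computer-vision model a real tool would use. All shape names and data fields here are assumptions for the example:

```python
# Illustrative only: classify shapes detected in a hand-drawn wireframe
# into UI components, then emit an ordered layout. A production system
# would use a trained vision model instead of these hand-written rules.

def classify_shape(shape):
    """Map one detected shape to a likely UI component."""
    kind = shape["kind"]
    w, h = shape["width"], shape["height"]
    if kind == "rect" and shape.get("has_x_cross"):
        return "image"       # a crossed-out box is a common image placeholder
    if kind == "rect" and w / h > 4:
        return "text_input"  # long, thin rectangles read as input fields
    if kind == "rect":
        return "button"
    if kind == "lines":
        return "paragraph"   # stacked horizontal lines denote body text
    return "unknown"

def sketch_to_layout(shapes):
    """Turn detected shapes into a top-to-bottom component list."""
    ordered = sorted(shapes, key=lambda s: s["y"])
    return [classify_shape(s) for s in ordered]

sketch = [
    {"kind": "rect", "y": 120, "width": 200, "height": 60},
    {"kind": "rect", "y": 10, "width": 300, "height": 60, "has_x_cross": True},
    {"kind": "lines", "y": 80, "width": 300, "height": 30},
]
print(sketch_to_layout(sketch))  # → ['image', 'paragraph', 'button']
```

The interesting engineering lives in `classify_shape`: replace the heuristics with a model trained on labeled sketches and the rest of the pipeline stays the same.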

To get a sense of the future of wireframing, check out this five-minute demonstration from Uizard on how to turn paper-based sketches into prototypes using native mobile design:

This demo shows how to teach machines to understand user interfaces. Video credit Uizard.

Turning visual design into code

Once designers finalize the look of the UI, they ship their work to a front-end developer to get it implemented into code. This stage of turning visual design into code can also introduce friction into the overall workflow because something can get lost in translation as developers code what already exists graphically. AI can offer a front-end coding opportunity for designers and streamline this step of the cycle.

For example, a technology called pix2code promises to transform digital sketches created by designers into computer code. The outcome of using pix2code mainly depends on sample size: companies can train the model on datasets of thousands of pairs of hand-drawn wireframe sketches and their code equivalents. The results are impressive. Uizard used this technology for its model and is able to generate code targeting three different platforms (iOS, Android, and web-based technologies) from a single input image with over 77% accuracy.
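To make the multi-platform idea concrete, here is a heavily simplified sketch of the final rendering step: once a model has recognized the components in an input image, a per-platform template table can emit equivalent code for the web, iOS, and Android. The templates and data shapes below are illustrative assumptions, not pix2code’s real output format:

```python
# Illustrative only: render one recognized component list as code for
# several target platforms. Real systems generate a richer DSL first.

TEMPLATES = {
    "web":     {"button": '<button>{label}</button>',
                "label":  '<span>{label}</span>'},
    "ios":     {"button": 'UIButton()  // title: "{label}"',
                "label":  'UILabel()   // text: "{label}"'},
    "android": {"button": '<Button android:text="{label}" />',
                "label":  '<TextView android:text="{label}" />'},
}

def render(components, platform):
    """Render a recognized component list as code for one platform."""
    table = TEMPLATES[platform]
    return "\n".join(table[c["type"]].format(label=c["label"])
                     for c in components)

recognized = [{"type": "label", "label": "Sign in"},
              {"type": "button", "label": "Continue"}]
print(render(recognized, "web"))
# → <span>Sign in</span>
#   <button>Continue</button>
```

The key point is that recognition happens once per image; targeting a new platform only means adding another template table.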

Uizard is not the only company working in this direction. Below is a video of a similar concept: Airbnb Sketching Interfaces. The wireframing automation technology that Airbnb created can live-code prototypes from whiteboard drawings. Since the Airbnb team standardized the components in its design system, the app matches the sketches to a standard UI component design when proceeding with the visual design.

Airbnb’s AI system recognizes standard hand-drawn elements and automatically renders them into source code. Video credit Airbnb.

This Airbnb example demonstrates that all of the effort invested in creating design systems will soon pay off. AI-based tools will use the information from design systems to automatically classify a sketch into the defined components and styles. As a result, it will be possible to generate user interfaces that follow a company’s UI design guidelines. It also means that the outcome will largely depend on how carefully the components and guidelines in the design system are defined.
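A minimal sketch of that classification step might look like the following, where `DESIGN_SYSTEM` is a hypothetical registry of a company’s standardized components and styles (not Airbnb’s actual system):

```python
# Illustrative only: resolve a classifier's raw label against a
# design-system registry so generated UIs use the company's components.

DESIGN_SYSTEM = {
    "button":  {"component": "PrimaryButton", "style": "brand/solid"},
    "input":   {"component": "TextField",     "style": "brand/outlined"},
    "heading": {"component": "Heading",       "style": "type/display-1"},
}

def resolve(label, fallback="GenericBox"):
    """Map a raw classifier label onto a design-system component."""
    entry = DESIGN_SYSTEM.get(label)
    return entry["component"] if entry else fallback

print([resolve(l) for l in ["heading", "input", "button", "avatar"]])
# → ['Heading', 'TextField', 'PrimaryButton', 'GenericBox']
```

The fallback branch is where a poorly defined design system shows: anything the registry doesn’t cover degrades into a generic element, which is exactly why the output quality depends on how complete the system is.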

Improving design exploration with wireframing AI

AI can be used not only as a tool for turning raw drawings into high-fidelity mock-ups or prototypes, but also as a tool for exploring the ideas themselves. Deep learning algorithms can take input parameters (what designers want to create) and propose multiple variations of a design according to the parameters. By using wireframing AI, designers will be able to specify a general direction of the design; machines will explore all possible options for them and then propose the best design solutions based on the original requirements. (The article “AI: Snap a UI mockup & finish the prototype in seconds” by AI writer Jonathan Hui can offer a sense of how this technology works.)

AI generates different layouts from the same UI mockup. Image credit Jonathan Hui.
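As a toy illustration of design exploration (real systems use learned generative models, not brute-force scoring), a “design explorer” could enumerate candidate layouts from high-level parameters and rank them against the stated requirements. Everything here, including the constraint format, is an assumption for the example:

```python
# Illustrative only: enumerate orderings of page sections and rank them
# against a simple constraint, returning the top candidates.

from itertools import permutations

def explore(sections, must_be_first=None, top_k=3):
    """Propose section orderings, preferring those honoring constraints."""
    candidates = []
    for order in permutations(sections):
        score = 1.0 if (must_be_first is None or order[0] == must_be_first) else 0.0
        candidates.append((score, order))
    # Highest score first; ties broken deterministically by order.
    candidates.sort(key=lambda c: (-c[0], c[1]))
    return [order for _, order in candidates[:top_k]]

for layout in explore(["hero", "features", "pricing"], must_be_first="hero"):
    print(layout)
# → ('hero', 'features', 'pricing')
#   ('hero', 'pricing', 'features')
#   ('features', 'hero', 'pricing')
```

The designer’s role in this loop is unchanged: the machine proposes, the human selects.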

Using AI-based content generators

The automation of design tasks is not limited to auto-generated layouts. When designers work on wireframes, they typically use content placeholders instead of real content. Technologies like generative adversarial networks (GANs) can be very helpful for generating content that replaces generic placeholders with realistic content, so the outcome resembles a real product.

Contemporary projects like This Person Does Not Exist, a GAN that generates facial images of fake people based on data from images of real people, already save designers a lot of time that would have otherwise been spent locating a relevant piece of content for their design. Wireframing automation will no doubt start to introduce tools like This Person Does Not Exist in their feature sets, so designers won’t need to leave their wireframing work in search of placeholder content.
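The integration could be as simple as a substitution pass over the wireframe. In the sketch below, `fetch_generated_face` is a hypothetical stand-in for a GAN-backed service in the spirit of This Person Does Not Exist; here it just returns a fake asset path so the pipeline can be shown end to end:

```python
# Illustrative only: swap generic avatar placeholders in a wireframe for
# generated content from a (hypothetical) GAN-backed asset service.

def fetch_generated_face(seed):
    """Hypothetical GAN call; a real tool would return a generated image."""
    return f"generated/face_{seed}.png"

def fill_placeholders(wireframe):
    """Replace avatar placeholders with generated content, in place order."""
    filled = []
    for i, element in enumerate(wireframe):
        if element == {"type": "avatar", "src": "placeholder"}:
            element = {"type": "avatar", "src": fetch_generated_face(i)}
        filled.append(element)
    return filled

page = [{"type": "avatar", "src": "placeholder"},
        {"type": "text", "value": "Jane Doe"}]
print(fill_placeholders(page))
# → [{'type': 'avatar', 'src': 'generated/face_0.png'},
#    {'type': 'text', 'value': 'Jane Doe'}]
```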

Will AI replace UI designers and front-end developers?

In short, the idea of fully replacing a designer or a developer with an algorithm sounds futuristic, but the fear of AI taking over those roles is unwarranted. Product design is a complex process with a lot of assumptions and unknown territories. When designers work on a new project, they make a lot of big and small decisions; many design decisions are based on personal taste and a designer’s understanding of requirements. Quite often, design requirements cannot be provided in the form of a clear specification. As a result, it’s unrealistic to expect that a machine will take something abstract and turn it into a great design.

So, when we talk about using automation in design, what we mean is a creative collaboration between humans and machines. The true power of AI-based systems is the synergy of the two—machines will help designers work more efficiently with their UX tools. Deep learning is good at extracting millions of patterns and turning them into design decisions, but it’s up to designers to select the one they think is best. Accordingly, the rise of AI-driven tools will help us reimagine the role of designers, who will be freed up to invest more time in requirement gathering and exploration.


AI-assisted wireframing will become an integral feature of future wireframing tools. Wireframing automation will help designers focus on testing functional design artifacts rather than investing time in polishing pixels. AI tools will act as an exoskeleton for designers, increasing their productivity and the speed of their decision-making.