Whether you’re just starting out with user research or are a seasoned UX researcher, here is a comprehensive list of questions to inspire you and help you succeed with your user interviews. UX research interview questions need a structure that lets you build a knowledge base and establish context before the design process starts. Learning and understanding the context of a problem will give insight into the possible outcomes for the design.

The four parts of UX research: discover, explore, test, listen. Image credit NN/g.

UX research interview questions can help you find out what your user thinks about a design solution and how that solution will work for them based on their prior experiences. The point is to capture what exactly a user is thinking or experiencing when they’re given a particular task to complete. 

The type of user interview questions you use will depend on the context of your testing parameters or hypothesis. First, ask yourself:

  • Are you looking to find out the user’s attitude about a design or feature?
  • Is their usage behavioral? Do users use the design or feature only after someone else has done so, or only under limited circumstances?
  • Is usage feature-related? Are users more likely to use the design or feature if the task can be completed easily?

Determining user expectations and impressions 

Below are some key questions to ask to gather overall feedback and determine a user’s impressions of a product. During your user research sessions, regardless of the type of product or feature, always pay attention to the user’s facial expressions and body language. You can follow up on any non-verbal communication to gain more insight into what a user was thinking or feeling while using the product, along with their first impressions. 

User expectations and impressions questions:

  1. What are you thinking as you look at this?
  2. What is your (first) impression of this product/feature?
  3. What do you think this product/feature does or will do?
  4. Where do you start?
  5. When and where do you think someone would use this product/feature?
  6. What do you expect to gain from using this product?
  7. What would keep you from using this product?
  8. Do you feel this product is similar to another one?
  9. Do you trust this product?
  10. You [started to shake your head] when I showed you the interface, what caused this reaction?

General task-driven feedback questions:

  1. How would you go about performing [task]?
  2. What do you expect to happen if you did this [task]?
  3. What alternative method would you use to perform [task]?
  4. Did anything surprise you or not perform as you expected?
  5. Was the interface easy to understand?
  6. What was the easiest task to accomplish?
  7. What was the hardest task to accomplish?

User testing specific tasks to assess interaction models

In addition to the general task-driven questions above, you can ask users to perform very specific, scripted tasks. These types of tasks test the usability and comprehension of the product’s interface. The purpose of scripted tasks is to find specific issues with interaction, comparing how the user expects the interface to work with how it actually works.

User testing tasks are direct and have a specific outcome, for example:

  • Find specific information on a page or subpage.
  • Navigate to a specific subpage.
  • Log in to the application.
  • Go to the Dashboard and modify your user profile avatar.
  • Send a photo to someone on your friends list.
  • Read a message from client services.
  • Record a short video and upload it to your profile.

The user will either complete each scripted task or fail it. After each assigned task, ask the user to rate it and provide feedback before moving on to the next one.

Specific task usability questions and framework:

After each task, prompt the user to rate the task using a series of scaled responses. You can modify the responses and ratings to suit the type of information you are trying to learn from the user testing sessions.

Image by Academic Scope.

Using a simple Likert Scale can help collect responses quickly and easily: 

An example of a Likert scale.

Remember to ask the user why they responded with each rating, especially if the rating is low or negative. Their answers will provide more insight into what they found easy or difficult about each task. Getting an immediate sense of why the user gave a poor rating will help diagnose specific problems with the interaction model or presentation of your design solution. Fixing a button label or changing the type of icon presented can significantly change the overall perception and experience. 
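To see how those ratings become actionable, here is a minimal sketch, with entirely hypothetical session data and a made-up `summarize` helper, of tallying 5-point Likert ratings per task and flagging tasks whose mean falls below a threshold as candidates for “why?” follow-ups:

```python
from statistics import mean

def summarize(ratings, threshold=3.0):
    """Return {task: (mean_rating, needs_followup)} for each task.

    ratings: dict mapping a task name to a list of 1-5 Likert scores
    (1 = very difficult ... 5 = very easy). A task is flagged for
    follow-up when its mean score is below the threshold.
    """
    return {task: (mean(scores), mean(scores) < threshold)
            for task, scores in ratings.items()}

# Hypothetical ratings from five participants per scripted task.
session = {
    "Log in to the application": [4, 5, 4, 5, 3],
    "Modify your profile avatar": [2, 1, 3, 2, 2],
    "Send a photo to a friend": [5, 4, 4, 5, 5],
}

for task, (avg, flagged) in summarize(session).items():
    print(f"{task}: mean {avg:.1f}" + (" <- ask why" if flagged else ""))
```

The threshold is a team choice, not a standard; the point is simply to pair each quantitative score with a qualitative follow-up while the session is still fresh.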

A chart showing two dimensions of questions that can be answered by user research: behavioral vs. attitudinal on the y-axis, qualitative vs. quantitative on the x-axis. 'What people do' occupies the upper center quadrant; 'Why and how to fix' occupies the left center; 'What people say' occupies the bottom center; 'How many and how much' occupies the right center.
Image credit Christian Rohrer, XD Strategy.

When users give you a statement about an aspect of the design or how they felt when using the product, ask them to expand on that idea, building on their attitudinal and behavioral responses. A “why,” “what,” or “how” follow-up question will allow you to dig deeper into the user’s thoughts and opinions. For example:

User response: “I thought the design was very plain.”
Follow up question: “What was it about the design that made you say it was plain?”

User response: “That is not a feature that I would use.”
Follow up question: “What was it about the feature that makes you not want to use it?”

User response: “I was not expecting it to work that way.”
Follow up question: “How would you expect it to work in that situation?”

User response: “I don’t like that color of blue.”
Follow up responses: “Why don’t you like that color of blue?” or “What is it about that color of blue that makes you not like it in relation to the product?”

User response: “This is not something that I would download and use.”
Follow up response: “Can you tell me why this application is not useful to you?”

Closing the user testing session 

You can choose to close out the session with a series of summary UX research questions; these let the user speak their mind or address design aspects of the product.

End-of-session follow up questions:

  1. Do you feel this design was made for you? Why or why not?
  2. What was the one thing you liked the most about the design?
  3. What was the one thing you disliked the most about the design?
  4. If you could change one thing about the design, what would it be?
  5. Would you download/use this product if the change(s) were made?
  6. Do you feel this is something for the desktop? Mobile? Or both?
  7. Would you recommend this to a family member or friend?

In each case, you may need to follow up further on the user’s responses. This can drive the conversation to help find the root of the design problem and validate your design assumptions.

When in doubt, always ask “why?”