
Written by

Kene Anoliefo


Published on

May 5, 2024

The Quick and Dirty Guide to User Research: How to do Usability Testing to Validate a User Experience

Learn how to do usability testing to validate a new user experience.

How do I ensure that I'm building a good user experience for my product?

In Part 2 we reviewed how to test out lo-fi concepts to quickly learn whether or not potential solutions are viable. At this point you’ve successfully validated one of your concepts and you’re ready to start building out the full solution. You should also have a strong understanding of customer needs and the outcomes they want your product to accomplish.

If you’re building a digital product, you’re likely working with an Experience or Product Designer to translate those outcomes into concrete features and workflows within a product experience, and now you have a solid design for the product.

The next stage of customer validation is focused on usability — how easy or hard is it for people to use the product and achieve the outcomes they want? Usability testing is important because it gives you the opportunity to clearly map out the user experience (UX) and observe users interacting with it, so you can be sure that they can use the product effectively.

Usability testing is most relevant for digital products and software. You can create a “clickable prototype” using a tool like Figma. The prototype is essentially a set of static screens that are linked to each other to give the impression of a working, live website or application.

To construct the prototype, choose the core workflows and journeys you expect people to take while using the product. Continuing from our example of a savings and investing app from Part 1, we might outline workflows like signing up for an account, linking bank accounts, entering saving goals, and viewing the calendar. Your prototype should include the major elements of the layout, like the overall navigation, so that it can be as faithful as possible to the real experience you plan to build.

Before doing testing, create a series of tasks that capture how users will interact with the product. These tasks should correspond to each of the core workflows you chose (e.g. Task 1 might be asking users to create an account). You’ll ask users to complete these tasks within the prototype during the interview.

Part 3: Validating a User Experience with Usability Testing

Goals:

  • Validate that potential users can effectively complete the core workflows in your product before you start building.
  • Uncover potential areas for improvement and friction within the user experience.

What you’ll need:

  • Ideal: High-fidelity clickable prototype tested across 10-15 interviews
  • Quick & Dirty: Medium-fidelity prototype tested across 3-5 interviews

Interview Discussion Guide

  1. Mini-needs validation: Do a short version of the needs validation you did in Part 1 by asking participants a few questions about who they are, their relationship to the problem space, and how they solve it today.
  2. Task Completion: Introduce tasks for users to complete one by one. Try to stay under 5 tasks per interview.
  3. Participant Narration: As users complete the task, ask them to narrate what they’re doing and thinking out loud. Listen closely for moments where they sound unsure or lack confidence about what to do next.
  4. Ease of Use: After each task, ask users how easy or difficult it was to complete the task and if there were any points in time that they were confused or unsure of what to do.
  5. Product Comprehension: When you’re finished with all tasks, do a mini-version of the product comprehension questions from Part 2: Concept Testing. Ask them to describe what the product does, who would use it and why, and how they would have used it to solve the problem the last time it occurred.

What signals should I look for in my interviews?

  • For each task, at least 75% of users can complete it successfully without needing your help. If many users are stumbling in a controlled environment, it’s fair to assume that they’ll stumble when they’re using it out in the wild after you release your product.
  • At least 50% of users can describe how they would have used the prototype the most recent time they encountered the problem. As we did with concept testing, continue to check that after participants learn about your product they can identify how they would have used it the last time they experienced the problem you’re solving.
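These pass/fail thresholds are easy to tally once your interviews are done. Here’s a minimal sketch in Python — the task names and interview outcomes are made up for illustration:

```python
# Hypothetical usability-test results: True = participant completed the
# task without moderator help. Data below is invented for the example.
task_results = {
    "Create an account":   [True, True, True, True, False],
    "Link a bank account": [True, False, True, True, True],
}

# True = participant could describe using the product the last time
# the problem occurred (the product-comprehension check).
recall_results = [True, True, False, True, False]

# Check each task against the 75% completion threshold.
for task, outcomes in task_results.items():
    rate = sum(outcomes) / len(outcomes)
    status = "pass" if rate >= 0.75 else "revisit the design"
    print(f"{task}: {rate:.0%} completed -> {status}")

# Check product comprehension against the 50% threshold.
recall_rate = sum(recall_results) / len(recall_results)
status = "pass" if recall_rate >= 0.5 else "revisit the concept"
print(f"Comprehension: {recall_rate:.0%} -> {status}")
```

With this sample data, both tasks clear the 75% bar (4 of 5 participants completed each) and comprehension clears the 50% bar (3 of 5), so all three lines print "pass."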

Examples of interviews with good signal and bad signal

Good Signal

Using our example of the savings and investing calendar app, here’s what an interview with good signal might sound like for the task of linking a bank account.

You: Next up, I’d like you to finish onboarding by linking your bank accounts. Please talk out loud and describe what you are doing as you interact with the prototype.

Customer: Ok. So to do that I would click here on “Link bank account.” I see that it pulls up a list of popular banking apps that I can link. I see my bank here so I click on it and it says that I can link it manually or by entering my login. I don’t know what manual linking is but it sounds hard so I’m going to choose enter my login. It would be great if they gave more information on security before asking about my login…and so on.

You: [After the user has successfully completed the task] Let’s pause here. On a scale of 1 to 5, with 1 being not very easy and 5 being very easy, how easy was it to complete that task?

Customer: Four. It was pretty easy. I could understand what to do at each step.

You: Were there any times that you were confused or unsure of what to do?

Customer: It was pretty straightforward, but I was a little hesitant to enter my password and login information to my bank account. I was looking for more information on their security and privacy policy and I couldn’t find it. That would have made me feel better about linking it.

From this interview we know:

  1. The user can fairly easily complete the steps to link their bank accounts — great design!
  2. They expected to see security and privacy information and didn’t. You should consider adding this information to increase trust and reduce friction.


Poor Signal

Here’s what an interview with poor signal might sound like.

You: Next up, I’d like you to finish onboarding by linking your bank accounts. Please talk out loud and describe what you are doing as you interact with the prototype.

Customer: Ok. So to do that I would click here on…hmm where would I click. There’s a button that says “see bank options” and a button that says “enter bank information.” So is the option that I can set up a new bank account or use my existing ones? I’m not sure, so I’ll just try “enter bank information.” When I click “enter bank information” it wants me to choose my bank, but I don’t see it listed here. I’m not sure what to do, can you tell me what comes next?

You: [Give the user guidance on how to complete the task] Let’s pause here. On a scale of 1 to 5, with 1 being not very easy and 5 being very easy, how easy was it to complete that task?

Customer: Two. I was pretty lost because it made it seem like if you didn’t have one of those banks, you couldn’t link your account. If I didn’t have help I probably would have given up.

From this interview we can observe that:

  • The participant frequently sounded unsure or confused while speaking out loud, and wondered if they were doing the wrong thing.
  • The participant was not able to complete the task without help from the interview moderator, which suggests that the design is not effective for this workflow.
  • The participant said that if they had encountered issues with this task within the real product they would have abandoned sign-up altogether.

If a task has a completion rate under 75%, you need to go back to the drawing board, rethink the user experience, and then test again. And if people generally get past the finish line but don’t sound confident as they do it, there could be room for improvement to really nail the experience.

Next Up: Validating Outcomes

If your prototype passed the threshold and got good signal for each core task — congratulations! You’re ready to take your product into development and ship it to customers once you’re finished building it. Now it’s time to move to the next phase: Validating Outcomes using A/B tests and Beta Releases.
