Chapter 6

3 real-life usability testing examples based on actual products

Get a feel for what an actual test looks like with three real-life usability testing examples from ElectricFeel, Typeform, and Movista. You'll learn about each company's test scenario, the types of questions and tasks their designers asked, and the key things they learned.

If you've been working through this guide in order, you should now know pretty much everything you need to run your own usability test. All that’s left is to get your designs in front of some users.

While it’s essential to learn about each aspect of usability testing, it can be helpful to get a feel for what an actual test looks like before creating your first test plan. Creating user testing scenarios to get the feedback you need comes naturally after you’ve run a few tests, but it’s normal to feel less confident at first. Remember: running usability tests isn’t just useful for identifying usability problems and improving your product’s user experience—it’s also the best way to fine-tune your usability testing process.

For inspiration, this chapter contains real-world examples of usability tests, with some advice from designers on writing usability tasks and scenarios for testing products.

Just arrived here? Here’s a quick recap to make sure you have the context you need:

  • There are many usability testing methods. Picking the right one is crucial for getting the insights you need.
  • Qualitative usability testing involves more open-ended questions, and is good for sourcing ideas or validating early assumptions.
  • Quantitative testing involves larger numbers of participants, which is useful for fine-tuning your design once you have a high-fidelity prototype.
  • If it’s too difficult to organize in-person tests, remote usability testing is a fast and cost-effective way to get the info you need.
  • Guerrilla usability testing is a great option for some fast, easy insights from real people.
  • Ask usability testing questions before, during, and after your test to give more context and detail to your results.

Start testing your product

Maze is a usability testing tool that allows you to run quick and easy usability tests with your prototype from Figma, InVision, Marvel, and Sketch. Get started for free.

If you’re not sure whether you’re at the right stage of the design process to conduct usability studies, the answer is almost certainly: yes.

It’s important to test your design as early and often as possible. As long as you have some kind of prototype, running a usability test will help you avoid relying on assumptions by involving real users from the beginning. So start testing early.

The scenarios, questions, and tasks you should create, as well as the overall testing process, will vary depending on the stage you’re at. Let’s look at three examples of usability tests at different stages in the design process: early-, mid-, and late-stage usability testing.

Early-stage usability test example: ElectricFeel

Product

ElectricFeel's product is a software platform for entrepreneurs and public transport companies to launch, grow, and scale fleets of shared electric bicycles and mopeds. It includes a mobile app for riders to rent vehicles and a system for mobility companies to run day-to-day fleet operations.

Feature being tested

When a new rider signs up to the ElectricFeel app, a fleet management team member from the mobility company has to verify their personal info and driver’s license before they can rent a vehicle. 

The ElectricFeel team hypothesized that if they could make this process smoother for fleet management teams, they could reduce the time between someone registering and taking their first ride. This would reduce friction in the overall experience for new riders.

Usability testing approach

The idea to improve the rider activation process came from a wider user testing initiative, which product designer Gerard Marti and the team saw as a vital first step before working on new designs.

After comparing the results of user persona workshops conducted both within the company and with real customers, the team used the insights to sketch wireframes of the new rider activation user interface.

Then Gerard ran some usability tests with fleet managers to validate whether the new designs actually made it easier to verify new riders, tweaking the design based on people’s feedback.

The next step in their process is running quantitative tests on alternative designs, then continuing to test and iterate on the winning option. Gerard sees quantitative testing as a vital step towards validating designs with real human behavior.

Test scenario

“You have four riders in the pipeline waiting to be accepted.”

Gerard would often leave the scenario at just this, as he wanted to observe the order in which users noticed each element of the design without steering them toward a specific goal.

Tip 💡

When testing early versions of designs, leaving the usability test scenario open lets you find out whether users naturally understand the purpose of the screen without prompting.

Task and question examples

To create a conversational and open atmosphere, Gerard starts with open questions that don’t invite criticism or praise from the participant:

  • What do you see on the screen?
  • What do you think this is for?

He then moves on to asking about specific elements of the design:

  • What information do you find is most valuable?
  • Are pictures or text more important for you?

By asking users to evaluate individual elements of the design, Gerard invites participants to give deeper consideration to their thought process when activating riders. This yields crucial insights into how the fundamentals of the interface should be designed.

The key thing they learned

Mid-stage usability test example: Typeform

Product

Typeform is a people-friendly online form and survey maker. Its unique selling point is its focus on design, which aims to make the experience for respondents as smooth and interactive as possible. As a result, typeforms have a high completion rate.

Feature being tested

Completion rates are a big deal for Typeform users, and because Typeform’s interface asks respondents one question at a time, being able to see the exact questions where people leave a form was a highly requested feature for a long time. That feature is now called ‘Drop-off Analysis’.

Product tip ✨

Before you even start designing a prototype for a usability test, do research to discover the kind of products, features, or solutions that your target audience needs. Maze Discovery can help you validate ideas before you start designing.

Usability testing approach

Yuri Martins, Product Designer at Typeform, and his team eventually reached the point where it was time to test their designs for the new Drop-off Analysis feature.

Fortunately, they had already contacted users and arranged some moderated tests one or two weeks before this point, anticipating that they’d need user feedback after the first design sprints. By the time the tests rolled around, Yuri had designed “a few alternative ways that users could achieve their objectives” in Figma.

Since the team wanted to iterate the design fast, they tested each prototype, then created a new updated version based on user feedback for the next testing session a day or two later. Yuri says they “kept running tests until we saw that feedback was repeating itself in a positive way.”

Tip 💡

Finding participants is often the biggest obstacle to conducting usability tests, so schedule your test sessions in advance, then spend the following weeks refining what you’d like to test.

Test scenario

“One of your typeforms has already collected a number of responses. The info you see appears in the ‘Results’ page.”

This scenario was designed to be relatable for Typeform users who had already:

  1. Made a typeform
  2. Shared it and collected responses
  3. Visited the ‘Results’ page to check on their responses

Choosing a scenario that appeals to this group of users ensured the feedback was as relevant as possible, as the people being tested were more likely to use the Drop-off Analysis feature to analyze their typeform’s results further.

Task and question examples

Typeform’s Drop-off Analysis prototypes only existed in Figma at this point, which meant that users couldn’t interact with the design to complete usability tasks. 

Instead, Yuri and the team came up with broader, more open-ended tasks and questions that aimed to test their assumptions about the design:

  • Tell us what you understand about the information on this page.
  • Describe anything missing that you would need to fully interpret the interface.

After the general questions, they asked questions about specific elements of the design to get feedback where they needed it most:

  • At the drop-off point, what do you understand?
  • What would you expect to see here?
  • Does this information make sense to you?

This example shows that you don’t need a fully functional prototype to start testing your assumptions. For useful qualitative feedback midway through the design process, tweak your questions to be more open-ended.

Product tip ✨

Maze is fully integrated with Figma, so you can easily upload your designs and create an unmoderated usability test with your Figma prototype. Learn more.

The key thing they learned

Late-stage usability test example: Movista

Product

Movista makes workforce management software used by retail and manufacturing suppliers. It helps users coordinate and execute tasks both in-store and in the field through a mobile app.

Feature being tested

As part of a wider design update across their entire product, Movista is about to launch a new communications product for messaging, chats, and announcements. This will let people in-store communicate better with people out in the field.

Usability testing approach

Movista’s new comms feature is at a late stage of the design process, so product designer Matt Elbert and the team tested a high-fidelity prototype.

By this point, the team was confident about the fundamental aspects of the design. These tests were to iron out any final usability issues, which can be harder to identify during the design process. By testing with a larger number of people, they hoped to get more statistically significant results to validate their designs before launch.
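To see why sample size matters here, a rough back-of-the-envelope calculation helps (our illustration, not Movista’s own analysis). The 95% margin of error on a task completion rate is approximately:

  margin of error ≈ 1.96 × √(p(1 − p) / n)

If 8 of 10 testers complete a task (p = 0.8, n = 10), that margin is about ±25 percentage points; with 50 testers (n = 50), it shrinks to about ±11 points. Larger samples make it much easier to tell a genuine usability improvement from noise.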

The team used Maze to conduct remote testing with their prototype, which included an overall goal broken down into tasks, and questions to find out how easy or difficult the previous step was.

Test scenario

“You have received new messages. Navigate to your messages.”

The usability tests would often begin in different parts of the product, with participants given a clear navigational goal. This prompts people to act straight away—without getting sidetracked by other areas of the app.

Matt advises people to be specific when using testing tools for unmoderated tests, as you won’t be there to make sure the user understands what you’re asking them to do.

Task and question examples

The general format of the usability test was to give people a very specific task, then follow up with an open question asking participants how it went.

  • How would you delete the message “yeah, what’s up?” that you sent to Mark Fuentes?
  • How did you find the experience of completing that task?

Matt and the team would also sometimes ask questions before a task to see if their designs matched users’ expectations:

  • What options would you expect to be available in the menu on the top-right corner of the message?

“Questions like this are super useful because this is such a new feature that we don’t know for sure what people’s priorities are,” said Matt. The team would rank people’s responses, then consider including different options if there was consistent demand for them.

Finally, Matt says it’s important to always include an invitation for participants to share any last thoughts at the end of the test.

The key thing they learned