The complete guide to usability testing
How to ask effective usability testing questions
Asking the right questions in usability testing is vital. In this chapter, we share examples of usability testing questions to ask before, during, and after the test, and best practices to keep in mind to get effective results.
There are two ways to gather info with a usability test: observing participants completing pre-defined tasks, and asking participants usability testing questions.
While the former serves as the core of your usability test, questions are a great way to get feedback on specific elements and pain points of your design. Often with user testing, there are one or two aspects of a prototype that you, as a designer, are particularly keen to validate. Asking questions can prompt a user to reflect on certain parts of the user experience and voice their feedback.
And outside the usability test itself, questions allow you to know the person behind the user. Which demographic they’re in, how familiar they are with technology, and what they think of your product overall are useful bits of info that add context and detail to your results.
All this is vital for getting the most out of each testing session. Since finding participants is always a challenge, every test counts.
So even if you can’t test as early and often as you’d like to, asking good questions is a path to getting the insights you need. In this chapter, we’ll break down what to ask, when to ask it—and most importantly, how to ask questions in a way that gets accurate results.
Let’s start with pre-test questions.
Finding participants for user testing can be tough. So it’s tempting to take people up on their offer to participate as soon as you find them.
But remember: not all test participants will give you equally relevant results. One usability testing best practice is to test people who resemble your target audience, as this will give you an accurate idea of how to improve your design from real potential users.
To find out more about your participants, you have two types of user testing questions at your disposal: demographic questions and background questions.
Demographic info gives your results more context, allowing you to spot usability trends across different demographics, ages, nationalities, genders, income groups, etc. Ask carefully though—demographic questions can touch on sensitive topics for some people. Here are a few examples of well-worded demographic questions to ask as part of your user research.
Asking demographic questions carefully avoids making assumptions about people by mistake. It also gets you the context you need—without making people uncomfortable with overly direct or specific requests for personal info. Getting off to a positive start will help your participants feel relaxed when it comes to taking the test.
Apart from finding out demographic info, it’s also smart to ask some screening questions related to people’s product habits and preferences. If someone’s using your competitors’ products, for example, you should take this into account when analyzing their results. This kind of info is also useful when thinking of post-test questions to ask specific people.
On the other hand, a participant who’s completely new to your product might spot usability issues in parts of your design that long-time users find intuitive. So it’s important to find out before you start the testing session.
Here are some background questions to ask your participants:
By deploying the right demographic and background questions pre-test, you make sure you choose the right test participants, and you’ll have more information with which to analyze your results afterwards. This can help explain potential anomalies in results, and also gives you info to make your design more accessible to a wider variety of people.
Asking questions during a usability test helps pinpoint design issues by probing a little deeper. However, it’s also important not to ask too many questions so the participant can complete the test with minimal distraction. To avoid having to rely on questions too much, make sure you write some great usability tasks.
The number one rule of usability test questions is no leading questions. A leading question influences the participant to respond in a certain way, skewing your results. Instead, you should carefully word your questions so they’re neutral and open.
Here are a few examples of questions you should not ask during a usability test:
By using words like ‘simple’ and ‘clear,’ you can unintentionally plant ideas into the participant’s head. So avoid adjectives whenever possible, and phrase questions in a way that invites participants to share their thoughts openly.
Here’s how you should phrase the questions above:
Notice that as well as not leading the participant, the questions are open-ended. Exclusively asking questions that require a ‘yes’ or ‘no’ answer isn’t ideal for a couple of reasons. First, your results will lack the detail and depth you need to make improvements—if a participant is struggling with specific tasks, you need to know why. Second, people might be tempted to answer ‘yes’ just to avoid having to explain themselves, even if in reality they’re having problems.
As inspiration, here are some more examples of well-written questions to ask during the test:
Taylor Palmer, UX Director at Lucidchart, shares examples of usability testing questions they ask during a session:
The features we test are often data-driven and highly technical. Because of that, we’re usually asking about a participant's understanding of the content, how they would like to adjust it, and their ability to navigate it.
While every usability study used to take place in person, these days technological advances have made unmoderated tests a viable way to get results fast.
Product tip ✨
Maze makes it easy to create usability tests, build sequences of tasks around them, and write questions for users to answer—before, during, and after the test.
The type of questions you should ask are generally similar regardless of your testing process. But there are a few things you should be aware of depending on the approach you choose.
If you’re doing moderated usability testing, you have the opportunity to follow up on anything the participant does that you find interesting. Use this to your advantage by asking specific questions based on user behavior:
However, just because you’re moderating the test, that doesn’t mean you should bombard the participant with questions throughout the test. It’s important to let the user feel relaxed and complete the test in their own time, so test conditions more accurately reflect a natural situation.
Kara Pernice from Nielsen Norman Group recommends following these steps when considering whether to speak to a user:
As a general rule, take your time working out exactly what you want to ask someone, how to phrase it correctly—and whether it’s worth asking at all. Jumping in at the wrong time could catch the user off-guard, potentially compromising their progress for that part of the test.
On the other hand, the unmoderated approach requires questions to be written in advance. It’s important to test your questions on people before you send them to users, as you won’t be there to clarify if people get stuck.
And while you can still ask some open-ended questions, providing rating scales or multiple-choice options saves participants from having to write long answers. Here are a few examples:
Finally, using simple and straightforward language is always best practice with any usability test—but it’s especially important when you’re writing questions for unmoderated testing. If people interpret a question differently, it’ll impact your results in an unpredictable way.
So avoid jargon and remove any internal placeholder terms your design team has been using. And if in doubt, ask a copywriter.
Once the participant has made it to the end, it’s a good time to ask broader follow-up questions about their overall experience. Since you’ve already asked about specific design details as they came across them, these questions can be more open-ended and could even include opportunities for the user to suggest improvements.
Here are a few examples:
Depending on the type of testing you’re doing, you might also want to conduct a more formal post-test UX survey. Using a standardized format with rating scale questions is a good way to gather quantitative data on the overall usability of your designs—particularly important if you’re at the summative testing stage.
Industry-standard scoring systems like the System Usability Scale (SUS) offer a straightforward approach, as you can use an already existing survey template. And since the SUS is used to measure the usability of many products, basing a post-test survey on it allows you to compare scores with other similar products—or even earlier versions of your own design.
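If you’re tallying SUS results yourself rather than using a survey tool, the standard scoring rule is simple: for the ten items (answered on a 1–5 scale), odd-numbered items contribute their score minus 1, even-numbered items contribute 5 minus their score, and the total is multiplied by 2.5 to give a 0–100 score. Here’s a minimal sketch of that calculation in Python (the function name and example responses are illustrative):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.

    Standard SUS scoring: odd-numbered items (positively worded)
    contribute (score - 1); even-numbered items (negatively worded)
    contribute (5 - score). The 0-40 total is scaled to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for item, score in enumerate(responses, start=1):
        total += (score - 1) if item % 2 == 1 else (5 - score)
    return total * 2.5

# Example: one participant's (hypothetical) responses
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))  # → 87.5
```

Note that a SUS score isn’t a percentage: a score of 68 is generally considered average, so interpret results against that benchmark rather than a school-grade scale.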
Finally, after you’ve thanked the user for participating, give them an opportunity to air any final remarks they might have. You never know—they might say something that sparks an idea for the next big feature.
Conducting usability tests can take considerable time and effort—and you only have a limited window to probe each user’s thoughts on your user experience design. Asking the right questions at the right time lets you squeeze as much juice out of each test as possible.
Get as much background info as you need, practice your wording to avoid leading questions, and think about the type of data you want to come away with at the end. Questions are one of the main techniques UX researchers have to interact with real users—so use them wisely.