Feb 21, 2018
If you're new to Maze, welcome aboard!
Here are seven tips for creating a great experience for your testers while getting the most out of Maze.
Remote testing sessions are very different from face-to-face interviews: testers aren't dedicating a hundred percent of their focus to your product. That's great news because in real life your users won't either.
Users give up some of their free time to participate, so their attention will drop significantly the longer your maze lasts. A good rule of thumb is to keep the number of missions under seven and the average completion time under five minutes.
💪 Pro-tip: Share your draft maze with a co-worker and track how long it takes them to complete it. If it's over five minutes, you might need to simplify your maze.
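The rule of thumb above can be expressed as a quick sanity check. This is a hypothetical sketch, not part of Maze itself; the function name and inputs are illustrative.

```python
# Illustrative helper (not a Maze API): sanity-check a draft maze against
# the rules of thumb above — under seven missions, under five minutes
# average completion time.

def maze_is_reasonable(mission_count: int, completion_minutes: float) -> bool:
    """Return True if the draft maze stays within both rules of thumb."""
    return mission_count < 7 and completion_minutes < 5.0

# A short, quick maze passes; a long one fails the check.
print(maze_is_reasonable(5, 4.5))
print(maze_is_reasonable(9, 6.0))
```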
Chances are that your testers are very new to the concept of prototyping and user testing. To help them, we've added the following warnings at the beginning of a maze:
✔︎ This is not the final product, only a succession of interactive pages
✔︎ You will never be asked to type or do a particular gesture, only clicks
✔︎ If something doesn't respond on click, it's not clickable
Even with these warnings in place, it's common to see patterns of user frustration during the first mission (misclicks, longer time spent on pages, bounces, etc.).
A great way to introduce the concept of prototyping (and Maze) to your testers is to start with a simple and straightforward mission: a 3-slide clickable walkthrough for your app has proven to work wonders.
💪 Pro-tip: Check out our article on how to introduce Maze to your testers.
Think of your mission description as a tweet: it conveys the general goal without going into too much detail. If you find yourself writing more than 140 characters, you're either:
Your product has been crafted to be used a certain way, so it's good practice to follow the product's natural user flow and avoid jumping between unrelated parts of your product from one mission to the next.
To achieve this, have each mission start on the previous mission's end screen whenever possible: doing so helps avoid confusion.
This is an obvious yet very important one: make sure your prototype doesn't have:
If a tester gets stuck, they are 20% more likely to bounce from the maze entirely rather than simply give up on the current mission.
Unless the variable you're testing is your testers' ability to understand your product's internal language, use broad, general terms to describe actions.
✅ Do: "Post a new status update!"
❌ Don't: "Send a Wuphf!"
A great way to make the most of the collected data is to create a sheet defining what you expect for each mission, KPI-wise.
Examples of mission KPIs:
Direct success: > 75%
Average time: < 12s
Misclick rate: < 20%
After your testing session is complete, compare your expectations to the collected KPIs and see where your design can be improved.
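The expectation-versus-results comparison can be sketched in a few lines. This is a hypothetical example, not Maze's data format: the KPI names, field layout, and thresholds are illustrative, taken from the sample targets above.

```python
# Illustrative sketch (not a Maze API): check one mission's collected KPIs
# against the expectations you defined in your sheet. "min" means the value
# must be at least the threshold; "max" means at most.

EXPECTATIONS = {
    "direct_success": ("min", 0.75),  # Direct success: > 75%
    "average_time":   ("max", 12.0),  # Average time: < 12s
    "misclick_rate":  ("max", 0.20),  # Misclick rate: < 20%
}

def review_mission(results: dict) -> list:
    """Return a message for each KPI that missed its target."""
    misses = []
    for kpi, (kind, threshold) in EXPECTATIONS.items():
        value = results[kpi]
        ok = value >= threshold if kind == "min" else value <= threshold
        if not ok:
            misses.append(f"{kpi}: got {value}, expected {kind} {threshold}")
    return misses

# Example: a mission where testers succeeded but took too long.
print(review_mission({"direct_success": 0.8, "average_time": 15.2, "misclick_rate": 0.1}))
```

Running this over every mission highlights exactly where the collected data fell short of your targets, pointing you to the screens worth redesigning.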