
Remote user testing with Maze

Fortunately, it has become standard practice: repeated validation with end users, so you don't deliver something at the end of a large development project that your target audience doesn't understand or need. Smaller projects benefit from user testing too. However, the costs can be high, especially if you conduct user testing on-site. Remote user testing lowers that threshold and has become extra relevant in these days of social distancing. That is why our User Experience Director Roel shares his experiences with Maze, a fine tool for getting insights from users quickly and inexpensively.

When we started using Maze for user testing a few years ago, we were pleasantly surprised at how smoothly it works. So we recently decided to use it again for two user tests of a new app from HVC. The team behind Maze has clearly not been idle: both the user experience and the functionality have been improved further in the meantime. Here is a short report of my experiences that you might benefit from.

Test 1: concept and design

In the first test, we examined the concept ("Does the user actually need this?") and tested the design ("Is what we came up with clear to the user?"). For this, we used a design created in Sketch and a prototype built in InVision. In Maze, we then entered the relevant paths as the 'mission' for the users and presented them with a number of accompanying questions.

Diagram of the flow from Sketch and InVision to Maze

After a few days we already had a fairly complete report with test results: what did the testers understand, which functionality could they not find, and which parts raised doubts? Maze often presents the results in handy graphs that quickly show how the answers are distributed (see image below).

Example of how a result is displayed in Maze

After an initial analysis and the removal of some less relevant outcomes, we were soon able to publish a report that was good enough to walk through with the Product Owner and draw insights from. Because you need a fair amount of context to get from results to insights, I don't recommend simply forwarding the report to your stakeholders.

Test 2: the beta version

In the second round, we tested a beta version of the actual app. Here we mainly tried to answer the questions "Are there still ambiguities for the user?" and "Is the app being used as we imagined?" This time, both the test script and the processing of feedback ran through Maze. The test brought to light a few more major ambiguities, which we were able to take into account immediately in further development.

Conclusion

Of course, face-to-face interviews remain our preference, mainly because they allow us to pay more attention to guiding the respondent and to non-verbal communication. Still, Maze is a nice, accessible alternative, especially if you have a group of testers 'close by' and don't need to spend much time on guidance.

Advantages

  • An important advantage of remote testing is of course that people can run the test on their own device, at their convenience, wherever they want.
  • Maze keeps becoming more useful thanks to the rapidly growing number of question types. Of course, you can ask yes/no and multiple-choice questions, and you can ask open-ended questions for direct feedback. But less common formats such as a Likert scale or card sorting are also available, so you can still choose fairly precisely the format that works best for your specific question.
Question types in Maze

Disadvantage

The only real drawback we found is that Maze does not yet handle fixed elements in Sketch Cloud prototypes well. Consider, for example, a menu bar that stays in place at the bottom of a mobile view. Fortunately, InVision offered a solution for that.

Want to know more about user testing?

Don't want to wait and eager to learn right away how to use (remote) user testing to design an optimal user experience?