In this video we cover:
What unmoderated usability testing is.
Pros and cons of unmoderated usability testing.
When (and when not) to use unmoderated usability testing.
How to set up and run a quick unmoderated usability test using Lyssna, a user research platform that makes it easy to conduct research, recruit participants, and analyze data.
Unmoderated usability testing lets you get real, honest feedback from users without needing to be there. This makes it cost-effective and scalable. Whether you're an experienced UX professional or just starting out, this video will give you the tools you need to run effective usability tests.
Ready to get started? Watch the video to learn more!
Chapters:
0:00 - Introduction
0:32 - What is unmoderated usability testing?
1:42 - Pros and cons of unmoderated usability testing
2:42 - When to use unmoderated usability testing
4:11 - Setting up an unmoderated usability test with Lyssna
8:22 - Reviewing the test results
9:50 - Wrap-up
Resources:
Get started with the Lyssna template library
Read our usability testing guide for more tips
Transcript
Are you looking to fine-tune a new product, validate a design decision, or simply understand your users better? Unmoderated usability testing can help. You can get real, honest feedback without even needing to be there. My name's Diane, and in this video we'll be exploring unmoderated usability testing.
We'll cover what it is, its pros and cons, when and when not to use it, and I'll also show you how to set up and run a quick, unmoderated usability test using Lyssna. Let's get into it.
Usability testing is a technique you can use to evaluate how easily users navigate and interact with your product or service. By asking real people to complete specific tasks, you can identify any issues and areas for improvement. For example, if you're trying to decide between two different design approaches for your product, you could run a preference test.
You'd show participants both designs and ask them which one they prefer and why. You might ask questions like, which design do you find more visually appealing and which layout do you think is easier to navigate? Analyzing their feedback will help you work out which design better meets user preferences and expectations, guiding your design choices.
It can also be a great way to settle internal design debates. Usability testing can be conducted in person or virtually, and in a moderated or unmoderated environment. When we run an unmoderated usability test, we're asking participants to complete these tasks independently, without us actually being present to facilitate.
This approach allows users to engage with the product in their natural environment, when it suits them, and provide authentic feedback.
Like with any research method, there are some pros and cons to unmoderated usability testing. In the pros column, it's cost effective and scalable. Whether you're testing with your own network or recruiting from a research panel, you can set up a test relatively affordably and test with a large group of users without needing to facilitate.
It's flexible. Your test participants can complete the test at their convenience, leading to more natural interaction. And there's a quick turnaround. Results can be gathered really quickly, allowing for faster design iterations. Some of the cons include that there are limited qualitative insights.
Without actually being present, it's harder to ask those follow-up questions about users' thoughts and motivations, or observe their behavior or expressions. There's also potential for misinterpretation. Participants might misunderstand tasks, and they're not actually able to ask for clarification. You also have less control over the testing environment and conditions.
Bearing in mind the pros and cons, it's important to weigh up when unmoderated usability testing is the right choice for you. Unmoderated usability testing is ideal in situations when you need to test with a large group of users, quickly and cost effectively. Like if you're setting up a new ecommerce platform and you want to test the checkout process with a really broad user base.
It's also ideal for tasks that don't require real time guidance or clarification, like the preference test I mentioned earlier. It can be a good option when you want to collect quantitative data such as task completion rates and time on task. This would be ideal if you wanted to measure the time it takes for users to complete a task like locating an item in your ecommerce shop or looking for a particular page on your website, or signing up to a subscription of your product.
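The quantitative metrics mentioned here are straightforward to compute once you have session data. As a rough illustration (not from the video, and using made-up session results), a sketch in Python might look like:

```python
# Hypothetical session results from an unmoderated test:
# each tuple is (completed_task, seconds_on_task).
sessions = [
    (True, 42.0),
    (True, 55.5),
    (False, 120.0),
    (True, 38.2),
    (False, 95.0),
]

# Task completion rate: share of participants who finished the task.
completion_rate = sum(1 for done, _ in sessions if done) / len(sessions)

# Time on task is usually reported for successful completions only.
success_times = [t for done, t in sessions if done]
mean_time_on_task = sum(success_times) / len(success_times)

print(f"Completion rate: {completion_rate:.0%}")
print(f"Mean time on task: {mean_time_on_task:.1f}s")
```

With real data you'd also typically look at the spread (median, percentiles), since a few very slow sessions can skew the mean.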
And unmoderated testing is also ideal when you want to test users in their natural environment to gain realistic feedback. Unmoderated usability testing might not be the right choice when you need detailed, qualitative insights and the ability to ask follow-up questions. In that case, a user interview might be the better choice.
It's also not great if your tasks are complex and require real-time guidance or clarification, or if you need to observe users' reactions and behaviors closely. It also might not be the right choice if you really want to control the environment to create that consistency. Okay, now that we've got a basic understanding of what unmoderated usability testing is and when it can be helpful, let's take a look at how to set up a quick test using Lyssna.
Here I am on the Lyssna dashboard and I'm going to go to create study. From here I can select test or survey to run an unmoderated test, or I can head to the template library. In the library I can browse all of the ready-made templates, or I can sort them by use case, methodology, or team. I mentioned preference testing in the introduction to this video, so I'm going to choose preference testing from the options here, and then I want to choose this test here.
So this test is designed to help you understand which of two different options your users prefer. It's a good choice if you're not actually sure which direction you want to go in, because you can gather some real user feedback and then back up your decision with data. So I'm going to choose use this template, and then here I am in the test builder.
So I can rename my test. I can then choose which language I'd like my participants to test in. I can save the test to a project, and I can also choose which devices I want them to test on. I can choose to add a screener to my test, and this is going to allow me to ask specific questions to either qualify or disqualify participants based on certain targeting criteria.
And I can also customise the welcome screen if I wanted to. I can add a new title, message, and start button text. But even if I don't do this, my participants will still get some guidance on what to do. And I'm going to show you what this looks like in a second. So next I can add an instruction to my preference test.
So in this example we have two designs and we want participants to choose which one they prefer. So I can easily remove the images from the template and upload my own, and I can then randomise the order that the images are displayed in. You can actually choose up to six images in a preference test. When you have two options, the results will show you the popularity of each design, and you'll also see a calculation of the statistical confidence of the preferred design. And I'm going to show an example of what this looks like soon.
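Lyssna's exact statistical confidence calculation isn't shown in the video, but a common approach for a two-option preference test is a binomial test against the 50/50 "no preference" null hypothesis. As a minimal sketch with made-up numbers (60 of 100 participants choosing option B), using only the Python standard library:

```python
from math import comb

# Hypothetical result: 60 of 100 participants preferred option B.
n, k = 100, 60

def binom_pmf(n: int, k: int, p: float = 0.5) -> float:
    """Probability of exactly k successes in n trials under Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

# Two-sided p-value: total probability of any outcome at least as
# unlikely as the observed one under the 50/50 null hypothesis.
observed = binom_pmf(n, k)
p_value = sum(binom_pmf(n, i) for i in range(n + 1) if binom_pmf(n, i) <= observed)

confidence = 1 - p_value
print(f"p-value: {p_value:.3f}, confidence: {confidence:.1%}")
```

For a 60/40 split across 100 participants this comes out to roughly 94% confidence, just shy of the conventional 95% threshold, which matches the intuition that the preference is suggestive but not yet decisive.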
So next you can ask some follow-up questions, and this is a good opportunity to gather some qualitative feedback. So here we're asking some open text questions, like "Why did you choose that design?", and a linear scale question: "How likely are you to click on this social media ad?" But you can add any questions you like.
So you can replace the questions in the template by deleting them and then adding your own new questions below. And if the order of your follow-up questions isn't that important, you can choose to randomise them. This can be a really good way to reduce bias in the responses you receive.
So from here I can save and preview my test, and we would really recommend that you do this so that you can see what your participants are going to see. So I mentioned earlier that participants will see a welcome message, they're then prompted to start the test, and they'll see the instruction that we added.
So they can view the options, click the design that they prefer, and then they'll be prompted to answer the follow up questions. So I'm just going to close the preview now. Once you're happy with how your test is looking and you've made any changes you need to, you can click save and continue. So from here, you can either choose to recruit from your own network by setting up a recruitment link, or you can recruit from the Lyssna panel.
So here I am in the order screen, and I can choose the number of participants I want to recruit for my study. And then, if I wanted to, I could also specify certain demographics. In the right-hand column, you'll see a summary of your order. So you'll see all of the specifications you set in your test, you'll see how much it's going to cost you, and you'll also see an estimated turnaround time.
So I'm going to pause the video here and wait for my results to come in. Okay, so my order took around 5 or 6 minutes, and I received an email to let me know that my results were in. So I can see straight away that option 2 is performing better, but that the difference isn't that significant. Although this does still give me a pretty good sense of what my target audience prefers, and I can dig into their responses a little bit more to understand why they chose the design they did.
So I can filter the responses for people who chose each option just by clicking on the little filter icon here. And then if I wanted to, I could also tag their responses by clicking on the little tag icon here. You can see I've already started to tag one of my responses here. So I can see people mentioning professionalism, things like the color palette, clear information, it's catchy, and so on.
So next I could move forward with this design feeling pretty confident about how it would be received, or I could make some iterations based on the feedback and do another round of testing. For example, I might compare two designs in this style, but with different CTAs or registration buttons. So that wraps up how to run a quick, unmoderated usability test in Lyssna.
It was really easy to set up using a template, and I had my results in a matter of minutes. I hope this video has been a helpful introduction to unmoderated usability testing. There are so many things you can test with unmoderated usability testing. Our template library is a really good place to start if you're looking for inspiration or use cases.
But if there's anything in particular that you'd like us to walk through, please do feel free to let us know by leaving a comment on this video. If you found this video helpful, please do give it a thumbs up and subscribe to our channel for more usability testing and research tips. Thank you and see you in the next one.
Try for free today
Join over 320,000 marketers, designers, researchers, and product leaders who use Lyssna to make data-driven decisions.