There’s all sorts of guidance around how many participants deliver the optimum level of insight, but generally, for a session of three tasks (which tends to last around an hour), you’d want to test with between three and seven people. The ‘80/20’ rule can apply here: a relatively small number of tests highlights common patterns in behaviour and feedback, revealing the majority of issues across a site or application.
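The intuition behind that ‘80/20’ pattern can be illustrated with the widely cited problem-discovery model, 1 − (1 − p)^n, where p is the probability that a single participant uncovers a given issue. The sketch below assumes the often-quoted average of p ≈ 0.31 (from Nielsen and Landauer’s research); your own value will vary by product and task, so treat the numbers as illustrative rather than a guarantee.

```python
def issues_found(n_participants: int, p: float = 0.31) -> float:
    """Expected proportion of usability issues surfaced by n participants.

    Uses the problem-discovery model 1 - (1 - p)^n, where p is the chance
    a single participant hits a given issue (0.31 is an assumed average).
    """
    return 1 - (1 - p) ** n_participants

for n in (3, 5, 7):
    print(f"{n} participants: ~{issues_found(n):.0%} of issues")
```

With these assumptions, three participants surface roughly two-thirds of issues and seven around nine in ten, which is why small rounds of testing, repeated and iterated on, tend to beat one large round.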
Recruitment
When recruiting participants for your testing project, refer back to your goals and segment in line with what you know of your target audience/core user groups – personas and analytics can be helpful here to refine your demographic split. You may also want to recruit across current and prospective customers so you’re not only getting insight from those familiar with your brand, and could even consider speaking to those who have previously had a bad experience with you, to capture some truly honest feedback!
While you can (and likely should) mine your own database for usability testing participants, working with a dedicated recruitment company or usability testing specialist gives you access to a wider pool of testers. These partners will also manage the end-to-end process for you, from booking session slots to providing incentives for taking part.
Documentation / materials
Your goals will additionally shape the tasks your participants complete, which you’ll need to outline in test scripts for your testing facilitator to follow; these keep sessions consistent and help ensure they deliver the focused insight needed.
Test scripts typically include repeatable instructions that can be used across testing sessions. For example, if you are testing a purchase journey for an ecommerce site, tasks may include navigating to a particular product, adding it to the basket, setting payment and delivery details, and completing the purchase.
Your test script can also cover nudges and prompts to support participants should they get stuck; however, it’s important to leave these as open as possible to avoid leading participants and influencing results. So rather than asking a closed question such as ‘did you click on that button because it was green?’ – which invites a one-word, affirmative answer – try asking a more open question such as ‘what was the reason you clicked on that button?’.
In addition to thinking about how you’ll write your usability tests, you’ll need to prepare diaries for the user testing sessions in which to record feedback and additional observations (more on which shortly), and ensure you have assets for your participants to test with. Whether this is a paper prototype, coded prototype or developed website / application, it’s important that all your documentation and materials are aligned, so be sure to review them – and conduct a practice run if possible – in advance of your sessions.
Preparing for your testing sessions
When preparing for your testing sessions, make sure you have someone in place to act as the facilitator – taking participants through the tasks and making them feel relaxed and comfortable – and someone to act as the observer throughout the session. The observer is responsible for recording participant feedback and monitoring nuances around expressions, body language and other non-verbal feedback, and will generally sit outside the room observing via a remote video / audio link, to maintain a natural environment for the participant.
It’s also important to ensure you have all the equipment you need in advance of the session, which may include laptops / tablets / smartphone devices, camera and audio equipment, and eye tracking software, as well as access to the site or prototype you’ll be testing with. Check all equipment well in advance, as well as after you complete your first session – it’s better to find issues at this point rather than once you’ve completed your testing activities!
Another crucial element of your session prep is your communication with participants, so that they are aware of where they need to be and when, and what will be required from them. Think about whether you’ll be sending digital or physical invitations and how you’ll manage other documentation, such as consent forms and pre-session questionnaires.
Running your testing sessions
If you’ve taken the time to plan and prepare your testing programme, running your sessions should be relatively straightforward – though doing it well still calls for its own set of skills.
Common techniques for testing include:

The ‘think aloud’ protocol, where participants verbalise their thoughts, actions and feelings as they work through each task, giving you a window into their decision-making in the moment

Moderated sessions, where a facilitator guides participants through the tasks and probes their behaviour in real time

Eye tracking, to reveal where participants’ attention is drawn as they interact with your site or application
A good facilitator will be experienced in encouraging participants to explore freely, and responding to their actions with relevant and insightful questions to understand why they’re doing what they’re doing – linking back to the need for open rather than closed questions.
An additional tool to capture participant feedback is the System Usability Scale (SUS) survey. Comprising 10 statements about various aspects of the overall experience, each rated on a 5-point scale, this is a great way to quickly capture valuable insight, and provides a clear score that’s perfect for benchmarking and to communicate what you’re doing to senior stakeholders.
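Converting SUS responses into that single benchmarkable score follows a standard formula: odd-numbered (positively worded) items contribute their rating minus one, even-numbered (negatively worded) items contribute five minus their rating, and the total is multiplied by 2.5 to give a 0–100 score. A minimal sketch, assuming the ten standard statements rated 1–5:

```python
def sus_score(responses: list[int]) -> float:
    """Convert ten 1-5 Likert responses into a 0-100 SUS score.

    Odd-numbered items are positively worded (score = rating - 1);
    even-numbered items are negatively worded (score = 5 - rating).
    The summed contributions are scaled by 2.5 to reach 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 responses")
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# Example: a fairly positive participant
print(sus_score([5, 2, 4, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

A score of 68 is commonly cited as the average benchmark, so tracking your score against that over successive rounds of testing gives stakeholders an easy headline metric.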
How to analyse your test results
One of the best things about usability testing is that you can start learning from your sessions instantly.
Whole team analysis activities such as clustering notes on observations and participant feedback – whether using physical post-it notes, or tools such as Miro – will help reveal patterns in behaviour and common pain points where you can focus your efforts. You may also find outliers that require further investigation, for example where participants aren’t as familiar with digital technologies and may be coming to your product or service for the first time.
There are many frameworks out there to help quantify the issues you uncover, such as Rose, Thorn and Bud:

Rose – something that’s working well and should be preserved

Thorn – a pain point or issue that needs to be addressed

Bud – an idea or opportunity with the potential to be developed further
These techniques will help you build a prioritised and actionable backlog of work, covering fixes to broken elements of your site or application along with new features to enhance the customer experience.
A workflow pattern for usability testing
I’ve taken you through a general workflow pattern for usability testing, but within this you will need to tailor the approach, activities and tools you use – driven always by your wider strategic and tactical goals.
At Box UK we’ve written a free white paper on common testing mistakes to avoid, drawing on our team’s experience conducting hundreds, if not thousands, of hours of usability testing for clients across a wide range of industries. Visit https://www.boxuk.com/insight/ten-common-usability-testing-mistakes/ to download your copy, and if you’re ready to start shaping your testing project, get in touch by emailing ux@boxuk.com or calling +44 (0) 20 7439 1900 to find out how we can help you.