Designing voice user interfaces is increasingly becoming part of user experience professionals’ job descriptions. While voice assistants and other audio-based tools follow much of the same user flow as a typical UX project, they come with some unique differences. This is particularly true when it comes time to test the voice interface itself.

To help, we’ve put together four key recommendations for getting started with testing your voice user interface designs.

Have a conversation. Voice assistants aren’t yet sophisticated enough to hold a typical human conversation, where they remember details from previous exchanges or understand slang. But they should be able to follow a set pattern of prompts and responses to produce the user’s desired outcome. One of the most effective ways to test this is to “perform” the script you’re creating for the application as though you’re speaking to another person. Have a volunteer read the part of the assistant while you follow the typical patterns of a human user. If you hit dead ends or get stuck in loops, you’ve spotted areas for concern. UX designer Cathy Pearl discusses this process in more depth in her excellent book Designing Voice User Interfaces.
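The same dead-end and loop checks can also be run mechanically once the script is written down. Here is a minimal Python sketch, with an invented example flow (the state names and transitions are purely illustrative, not from any real assistant): it models the script as a graph of prompts and checks whether any state traps the user.

```python
# Hypothetical sketch: model a scripted VUI dialog as a graph of states,
# then check for accidental dead ends (states with no exit) and verify
# the user can always reach a terminal state. All names are invented.

FLOW = {
    "greet":     ["ask_order"],           # "Hi! What would you like?"
    "ask_order": ["confirm", "clarify"],  # user names an item, or mumbles
    "clarify":   ["ask_order"],           # "Sorry, which item was that?"
    "confirm":   ["done"],                # "Great, placing the order."
    "done":      [],                      # intended terminal state
}

def dead_ends(flow, terminals):
    """States with no outgoing transitions that aren't meant to be final."""
    return [s for s, nxt in flow.items() if not nxt and s not in terminals]

def can_finish(flow, start, terminals):
    """Breadth-first search: can the user ever reach a terminal state?"""
    seen, frontier = set(), [start]
    while frontier:
        state = frontier.pop()
        if state in terminals:
            return True
        if state in seen:
            continue
        seen.add(state)
        frontier.extend(flow.get(state, []))
    return False

print(dead_ends(FLOW, terminals={"done"}))   # no accidental dead ends
print(can_finish(FLOW, "greet", {"done"}))   # script can complete
```

This doesn’t replace the table read with a volunteer, but it catches the same structural problems (loops with no exit, prompts that lead nowhere) as the script grows.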

Look for gaps in information. When we talk to other humans, we don’t always include all the necessary information in our sentences. This is because we’re able to pick up on context clues and other details that an automated assistant can’t. For example, voice assistants won’t necessarily understand the difference between a question like “How far is McDonalds?” and “How far is the nearest McDonalds from me?” Finding these gaps through user testing can help us identify moments when the voice assistant will need to request additional information in order to complete its task.
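In implementation terms, this pattern is often called slot filling: each intent has required pieces of information, and the assistant asks a follow-up question for anything the user omitted. The following Python sketch uses invented slot names and prompts to illustrate the idea:

```python
# Hypothetical sketch of slot filling: compare a parsed request against the
# slots an intent requires, and return follow-up questions for anything
# missing. The intent, slot names, and prompts are invented for illustration.

REQUIRED_SLOTS = {
    "distance_query": {
        "place": "Which place do you mean?",
        "origin": "From where should I measure?",
    }
}

def missing_slot_prompts(intent, slots):
    """Return a follow-up question for every required slot the user omitted."""
    required = REQUIRED_SLOTS.get(intent, {})
    return [prompt for slot, prompt in required.items() if slot not in slots]

# "How far is McDonald's?" names a place but gives no starting point:
print(missing_slot_prompts("distance_query", {"place": "McDonald's"}))

# "How far is the nearest McDonald's from me?" fills both slots:
print(missing_slot_prompts("distance_query", {"place": "McDonald's", "origin": "me"}))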

Experiment across different ways of speaking. Though voice interfaces have improved their ability to detect human speech, there are still far too many instances in which people can’t use these tools because of accents or altered speech patterns. This can be embarrassing and frustrating for the end user and make it less likely that others will adopt the tool. To help counteract this, it’s important to incorporate different ways of speaking into the user testing process. Just as you would recruit diverse users to test a digital interface, you should include many different types of speech in the testing of a voice interface.
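One lightweight way to structure this is to keep a list of phrasing variants for each request and run them all through your recognition pipeline, flagging any that fail. In this Python sketch the recognizer is just a stub (the real step would call your actual speech/NLU system), and the example phrasings are invented:

```python
# Hypothetical sketch: run many phrasings of the same request through a
# stand-in recognizer and report the ones that don't map to the expected
# intent. `toy_recognize` is a stub; a real test would call your pipeline.

import re

def toy_recognize(utterance):
    """Stub NLU: maps anything mentioning 'weather' to a get_weather intent."""
    return "get_weather" if re.search(r"\bweather\b", utterance.lower()) else None

VARIANTS = [
    "what's the weather like",
    "tell me the weather",
    "weather please",
    "is it gonna rain",   # slangier phrasing the stub misses
]

failures = [u for u in VARIANTS if toy_recognize(u) != "get_weather"]
print(failures)  # phrasings that need attention in training or design
```

The failing phrasings become the agenda for your next round of testing: either the recognizer needs to handle them, or the assistant needs a graceful re-prompt when it doesn’t.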

Automate testing when appropriate. While human user testing is critical for voice interfaces, there’s also a role for automated testing. For example, there’s no way that a human tester can replicate the millions of different variations in human speech that a voice assistant needs to handle. Similarly, human testers can’t generate every possible request or response that a voice assistant is likely to receive. For that, automated testing tools can help. Two of the most popular voice user interface testing tools are the open source tool Botium and the paid tool Bespoken. You can read more about these two tools in this article from Emtec Digital.
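The kind of coverage described above usually starts with generating utterance variations from templates, far more than a human tester could type by hand. Here is a minimal Python sketch of that idea; the templates and place names are invented, and tools like Botium layer scripting and assertions on top of this kind of generation:

```python
# Hypothetical sketch of automated utterance generation: combine phrase
# templates with slot values to produce a test set no human would write
# out by hand. Templates and fillers are invented for illustration.

from itertools import product

TEMPLATES = ["how far is {place}", "distance to {place}", "{place} how far"]
PLACES = ["the station", "the nearest McDonald's", "downtown"]

utterances = [t.format(place=p) for t, p in product(TEMPLATES, PLACES)]

print(len(utterances))  # 9 generated test utterances
for u in utterances[:3]:
    print(u)
```

Each generated utterance would then be fed to the assistant and its response checked against the expected intent, which is exactly the loop the automated tools mentioned above are built to run at scale.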

Voice user interfaces will continue to evolve and improve over time. As they do, testing these interfaces will grow even more important to ensuring they deliver accurate, reliable results in as many different situations as possible.


