The best conversation experiences start with best-in-class user testing.
The conversation design process isn't always a linear path to launch. Designers often spend weeks prototyping, collecting feedback, and iterating on their designs before going live. User testing is the not-so-hidden secret that takes many design teams from ideation to full production – and our newest feature, Transcripts, aims to make that easier than ever.
Transcripts are a brand-new way to organize and analyze all of your user testing in one place.
Whether you're a team running internal tests or a group of users and stakeholders interacting with early prototypes, user testing is an integral part of the design process. Transcripts give teams better insight into their users' problems by logging their conversations, making it easier to identify which parts of the experience serve users best.
Why is user testing important?
User testing closes the design loop: teams can take their original prototype, share it with internal or external stakeholders, and validate whether the design actually solves the problem at hand. This tight feedback loop lets teams gather better insight, categorize their mistakes and opportunities, and, more importantly, design better experiences.
This process also lets teams break their workflow into stages: first shipping an MVP (minimum viable product), then validating it through user testing, iterating, and launching.
The modern customer experience places an enormous emphasis on user experience. While many teams spend countless sprints in the planning or creation phase, those who get their tests in front of real users often come out ahead.
By testing your experience with real users or stakeholders, you can generate real-time feedback and combine quantitative and qualitative data to improve your experience. Using features like Transcripts, you can:
- Test your project as many times as you'd like
- Review the conversations and note/highlight what matters most
- Attribute positive or negative experiences to tickets or instances directly
- Identify quantitative metrics, like confidence scores and their correlation with errors
- Tag teammates to add missing utterances or intents
- Last but not least, encourage teams to test more – making rapid prototyping possible at the click of a button
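To make the quantitative side concrete, here is a minimal sketch of how a team might flag low-confidence turns in a logged transcript for review. The data shape, field names, and threshold are illustrative assumptions – this is not Voiceflow's actual transcript format or API.

```python
# Hypothetical sketch: surfacing low-confidence turns in a user-testing
# transcript. The dict shape and 0.6 threshold are assumptions.

LOW_CONFIDENCE = 0.6  # assumed cutoff below which a turn warrants review

def flag_turns(transcript):
    """Return turns whose intent-match confidence falls below the cutoff.

    Each turn is assumed to look like:
        {"utterance": "...", "intent": "...", "confidence": 0.42}
    """
    return [t for t in transcript if t["confidence"] < LOW_CONFIDENCE]

sample = [
    {"utterance": "book a flight", "intent": "book_flight", "confidence": 0.93},
    {"utterance": "uhh never mind", "intent": "None", "confidence": 0.31},
    {"utterance": "change my seat", "intent": "change_seat", "confidence": 0.55},
]

for turn in flag_turns(sample):
    print(f"review: {turn['utterance']!r} matched {turn['intent']} "
          f"at {turn['confidence']:.2f}")
```

A review like this is where qualitative notes meet the numbers: the flagged utterances are natural candidates for missing utterances or new intents.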
What does this unlock?
It's no secret that testing helps create better user experiences. Transcripts take this further by letting designers get in front of more people, categorize their findings, and run tests asynchronously. This not only frees participants to complete tests on their own schedule but also opens the door to both guided and unguided testing.
This degree of flexibility empowers teams to collect better, less biased data – unlocking opportunities to improve conversational experiences regardless of timezone or proximity.
Transcripts is the first of many user-testing features we plan to release. This first stage gives teams insight into what's happening when people engage with a live assistant. We want teammates to be able to review each conversation or contact and leave in-line notes or highlights that feed back into the overall design cycle.
With this first stage, we're committed to giving teams the tools to improve their conversation experiences – whether that means identifying gaps in their flows, improving no-match intents, or collectively designing dynamic repair paths. The choice is yours.
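As a rough illustration of what a "repair path" means in practice, here is a tiny sketch of escalating responses after repeated no-match turns. The thresholds and messages are made up for illustration; real repair paths would be designed in your assistant's flow, not hard-coded like this.

```python
# Hypothetical sketch of a simple repair path: reprompt after a first
# no-match, offer examples after a second, then hand off. All copy and
# thresholds here are illustrative assumptions.

def repair_response(no_match_count):
    """Pick an escalating response based on consecutive no-match turns."""
    if no_match_count == 1:
        return "Sorry, I didn't catch that. Could you rephrase?"
    if no_match_count == 2:
        return "You can say things like 'book a flight' or 'check my order'."
    # After repeated failures, escalate to a human handoff.
    return "Let me connect you with a teammate who can help."

for count in (1, 2, 3):
    print(count, "->", repair_response(count))
```

Reviewing transcripts is what tells you where a path like this is needed: clusters of consecutive no-matches are a strong signal that a flow needs a repair branch.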
Looking ahead, we're working toward automating much of this process: letting teams review transcripts in moments and automatically surfacing gaps and opportunities through smart suggestions and AI-powered grouping. Our goal is a seamless, powerful experience for both builders and testers, and that starts with giving teams the tools to run faster, more organized testing.
Furthermore, we want to empower all teams – no matter their size – to train their models with real data. That means an increased focus on democratizing how we generate tests, share links, and learn from real conversations. This mix of real conversation data and the flexibility to choose your own NLU/NLP or assistant experience will let more teams proactively recommend solutions and identify gaps in real time.
While user testing remains a pivotal step in many conversation design workflows, it's important to keep making it easier and more organized. With the release of Transcripts, we hope we're making strides toward improving the user-testing workflow and helping people learn directly from their customers. After all, much like conversations themselves, conversation design is far from one-sided.
If you'd like to learn more about Transcripts or user testing with Voiceflow, you can read the change log here.
Or, if you're a team looking to get a demo of what building with Voiceflow could look like, don't hesitate to reach out and book a demo with our team.