A computer needs specific information to understand human language. A conversational AI interaction model provides the information a computer needs to understand and process a given voice request or command.
The interaction model should define how each part of your conversational assistant interrelates in a way that mirrors real user interactions. It ensures that users know how to move from path to path within your conversational assistant, and it aligns conversation designers, developers, and external stakeholders around a common understanding of each pathway and the overall goal of the assistant.
The interaction model is the blueprint for your conversational assistant - it’s the machine that powers natural conversations.
When using an interaction model, conversation designers aren’t creating linear conversations that follow a decision tree.
Interaction models for conversational AI assistants mimic human conversation - where participants alternate positions as sender and receiver and generate meaning by sending messages and receiving feedback within physical and psychological contexts (Schramm, 1997).
Your interaction model should be informed by the initial user research done by your design team. By nature, interaction models are lower fidelity and more strategically focused than dialog management or NLU models, which we’ll cover later on in this post.
If we consider what interaction models really are, we begin to realize that they are nothing less than the designer’s attempt to help the user generate a mental model of the device.
Your interaction model doesn’t have to include every nuance of every conversation a user could have with your assistant; rather, it should include the mechanisms you’ll build into your assistant to help your users contextualize the experience they’re engaging with.
For example, your interaction model should include:
- Scenario Ideations - Who are the personas that will interact with your conversation design, and what might they need, given the context you’re creating with your conversational assistant?
- Information Architecture - What is the right organization and flow of the information you want to show each persona in each context, so you can answer their question with the least friction possible?
- Conceptual Models - What is each persona’s mental model for how each action is performed, and how can you incorporate that into your conversational experience? For example, if your conversational assistant will handle meeting booking or credit card transactions, how can you incorporate a scheduler or payment widget that mimics how a user would experience that process on a website or in real life?
- Components - What will each persona experience that is consistent across all paths your conversational assistant can provide? What patterns will persist across every user’s experience interacting with your conversational assistant?
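As a low-fidelity starting point, these four pieces can even be captured as structured data before any building begins. The sketch below is illustrative only - every name (`Persona`, `InteractionModel`, and the sample values) is a hypothetical stand-in, not part of any particular tool:

```python
# A minimal sketch of an interaction model as structured data.
# All names and sample values here are hypothetical.
from dataclasses import dataclass

@dataclass
class Persona:
    name: str
    context: str        # where/why this persona talks to the assistant
    goals: list[str]    # what they need answered or done

@dataclass
class InteractionModel:
    scenarios: list[Persona]                        # scenario ideations per persona
    information_architecture: dict[str, list[str]]  # topic -> ordered info to surface
    conceptual_models: dict[str, str]               # action -> real-world analogue to mimic
    components: list[str]                           # patterns shared across every path

model = InteractionModel(
    scenarios=[Persona("Traveler", "planning a trip", ["check weather", "book a meeting"])],
    information_architecture={"weather": ["location", "date", "forecast"]},
    conceptual_models={"book_meeting": "calendar scheduler widget"},
    components=["greeting", "help flow", "fallback"],
)
print(model.scenarios[0].name)  # → Traveler
```

Keeping the model this lightweight makes it easy to revise after each round of user research, before anything is committed to an NLU model or dialog manager.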
Creating a well-thought-out interaction model influences every other part of the conversation design experience: it shapes the utterances, intents, and entities you add to your NLU model as contextualized inputs, which are then handed off to the dialog manager to turn into assistant outputs and actions.
How Does The Interaction Model Interact With Other Conversational AI Components?
After you’ve defined your assistant’s interaction model, your conversation design team can start to build the framework of your conversational assistant in a dialog system - usually a VUI, a voice user interface that lets users interact with a system through voice or speech commands using speech recognition technology.
The interaction model of a conversational assistant works as the blueprint that then informs the NLU and the dialog manager of the assistant. Let’s break down how the interaction model you create before you start designing informs the entire conversational experience.
Interaction Models Impact NLU
In your interaction model, you want to define scenario ideations, information architecture, and conceptual models - all of the potential thought processes, questions, and scenarios users could come to your assistant intending to solve.
With that in mind, you can start to craft the low-fidelity NLU model you believe your conversational assistant should be able to handle at launch.
On most projects, building conversational AI and NLU models is an iterative process based on live customer data and user feedback. Don’t worry if your conversational assistant can’t understand every command on the first go. There’s an art and a science to building a bot that can fail gracefully as well.
💡 Pro Tip: When you’re designing the V1 of your conversational assistant, choose 3-5 commands you want your conversational assistant to perform well - as close to perfect as possible. That way you can build upon a solid foundation and don’t have to worry about building an assistant that can do everything on day 1.
The Makeup Of An NLU Model
NLU, or natural language understanding, deals with machine reading, or reading comprehension. NLU goes beyond sentence structure and aims to understand the intended meaning of language. While humans effortlessly handle mispronunciations, swapped words, contractions, colloquialisms, and other quirks, machines are less adept at handling unpredictable inputs.
An NLU model is composed of inputs that include utterances, intents, and entities. In the example below you can see how each input dictates what the machine understands.
In the “Show Weather” intent, you’ll have “Show me today’s weather forecast in San Francisco” as an utterance for the machine to interpret. Based on the intent, the machine knows it needs to show the weather for a place and a date. Because the entities in the utterance are “today” and “San Francisco”, the machine now knows to look at the weather report for today, in San Francisco.
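To make the mechanics concrete, here is a deliberately tiny, rule-based sketch of how an utterance maps to an intent and entities. Real NLU engines use trained statistical models, and every name here (`understand`, `KNOWN_PLACES`, and so on) is hypothetical:

```python
import re

# Toy NLU sketch (not a real NLU engine): classify an utterance into an
# intent and pull out the entities the "Show Weather" example relies on.
INTENT_PATTERNS = {
    "show_weather": re.compile(r"\bweather\b|\bforecast\b", re.IGNORECASE),
}
KNOWN_PLACES = ["San Francisco", "Toronto", "London"]
KNOWN_DATES = ["today", "tomorrow"]

def understand(utterance: str) -> dict:
    intent = next((name for name, pattern in INTENT_PATTERNS.items()
                   if pattern.search(utterance)), "fallback")
    entities = {}
    for place in KNOWN_PLACES:
        if place.lower() in utterance.lower():
            entities["place"] = place
    for date in KNOWN_DATES:
        if date in utterance.lower():
            entities["date"] = date
    return {"intent": intent, "entities": entities}

result = understand("Show me today's weather forecast in San Francisco")
print(result)
# → {'intent': 'show_weather', 'entities': {'place': 'San Francisco', 'date': 'today'}}
```

A trained model would generalize far beyond these hard-coded lists, but the shape of the output - one intent plus a set of entities - is the same structure the dialog manager consumes.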
Building the NLU model is imperative to creating a smart and effective conversational assistant, but the NLU isn’t what users interact with on its own. The NLU needs to be built into a dialog manager that can understand the contextual information derived from previous dialog turns.
Let’s dive into dialog management now.
Interaction Models Impact Dialog Management
When you’re mapping out scenarios for customer interactions in your interaction model, you’re really creating a low-fidelity version of your dialog manager.
Your dialog manager will house components - reusable paths and steps that could be triggered in multiple parts of a conversation like business logic, auto-responders, or “help” flows.
💡 Pro Tip: When you’re mapping out your conversational experience, start with the “happy path” first. This is the path that, in a perfect world, all of your users would take when interacting with your conversational assistant.
Designing for happy paths - plus contingency plans for when a user does end up on an “unhappy path” - ensures that the 3-5 scenarios you build for as you start designing your conversational assistant are contextual, with built-in solutions to the issues you can anticipate.
What Is Dialog Management?
Dialog management is a design system that offers a more flexible way to design customer-centric conversational experiences. It involves scripting dialogue between the AI assistant and the customer so that you can take those conversations and convert them into storyboards.
Dialog managers can be purely code based, or be no-code or low-code VUIs that provide a visual layer to map out your conversation design. Voiceflow is a dialog manager that allows you to design conversations while using your NLU model.
Dialog managers are the conversational connective tissue between the user inputs and the machine’s outputs to create a more natural conversational experience between your company and your users. You can have the most robust NLU environment you can dream of, but without a dialog manager to contextualize your NLU model, your conversational assistant won’t be very helpful, or facilitate appropriate conversations.
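As a rough illustration of that connective tissue, the sketch below shows a hypothetical, minimal dialog manager (not Voiceflow’s API or any real tool’s) that merges NLU output into conversation context across turns and falls back gracefully when a turn isn’t yet actionable:

```python
# Minimal dialog manager sketch. All names are hypothetical; a real dialog
# manager handles far richer state, branching, and integrations.
class DialogManager:
    def __init__(self):
        self.context = {}  # slot values carried across dialog turns

    def handle(self, nlu_result: dict) -> str:
        # Fold this turn's entities into the running conversation context.
        self.context.update(nlu_result["entities"])
        if nlu_result["intent"] == "show_weather":
            missing = [slot for slot in ("place", "date") if slot not in self.context]
            if missing:
                # Happy path isn't complete yet: prompt for the missing slot.
                return f"Which {missing[0]} would you like the forecast for?"
            return f"Here is the forecast for {self.context['place']} {self.context['date']}."
        # Unhappy path: fail gracefully instead of going silent.
        return "Sorry, I didn't catch that. I can show you a weather forecast."

dm = DialogManager()
# Turn 1: intent recognized, but the user gave no place or date yet.
print(dm.handle({"intent": "show_weather", "entities": {}}))
# Turn 2: the missing slots arrive; earlier context makes the turn actionable.
print(dm.handle({"intent": "show_weather",
                 "entities": {"place": "San Francisco", "date": "today"}}))
```

Notice that the second turn only succeeds because the manager remembered the first - that carried context is exactly what an NLU model alone cannot provide.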
Historically, dialog management lived in spreadsheets or flowcharts, which meant development teams had to build conversational experiences from scratch, inferring conversational logic from keys of shapes and colors.
Today, dialog management tools are more intuitive, and can even be exported in a JSON format so that developers no longer have to hard code conversational experiences from scratch.
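The exact export schema varies by tool, but as an illustration, a flow exported as JSON might be loaded and walked like this (the structure shown is hypothetical, not any specific tool’s format):

```python
import json

# A made-up example of what an exported dialog flow could look like as JSON.
exported_flow = """
{
  "flow": "show_weather",
  "steps": [
    {"id": "ask_place", "type": "prompt", "text": "Which city?"},
    {"id": "answer", "type": "response", "text": "Here is the forecast."}
  ]
}
"""

flow = json.loads(exported_flow)
for step in flow["steps"]:
    print(step["id"], "->", step["type"])
# → ask_place -> prompt
# → answer -> response
```

Because the design lands in a machine-readable format like this, developers can generate or wire up the conversational logic directly instead of re-reading a flowchart.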
Let’s put it all together now.
Interaction Models Are Your Blueprint For Conversational Assistants
Before you put pen to paper - or cursor to VUI - building and defining an interaction model can set your design team on the right course to create a truly contextual, helpful, and well-contained conversational assistant.
The evolution of designing a conversation follows this timeline:
- User research
- Building and defining your interaction model
- Data analysis to build your NLU model
- Dialog management and conversation design
Once you have a high fidelity prototype, you’re ready for user testing and development. But we’ll cover that another time.
Get started with Voiceflow, a free dialog management tool, here!