This strategy began to show cracks 5 years ago as organizations started forming their own in-house teams to build and manage conversational AI. This led to the rise of conversational AI product teams, and titles like “Conversation Designer”. As these in-house teams grew, specialized platforms like Voiceflow initially emerged to fill the UX gap left by the large CAI platforms. Companies bought Voiceflow’s conversation design platform so their in-house teams could build in a usable tool, then export to their chosen CAI platform to leverage its proprietary technology (often NLU).
The launch of ChatGPT broke the existing CAI platform model: the proprietary technologies that had been the backbone of the CAI platform strategy now faced commoditization. Every platform and customer suddenly had access to the same cutting-edge technology.
There is no longer sustainable differentiation and defensibility at the technology layer for most CAI platforms. What matters now for CAI platforms is the UX when creating agents, and the flexibility and scope of integrations. After all, if every platform is now powered by the same technology, the only differentiation that remains is how easy it is to actually achieve your desired outcome.
This new reality places CAI platforms at a crossroads. If a platform doesn’t have the capital to become a Large Language Model (LLM) provider itself or build new technology differentiation, it has two options to stay competitive moving forward:
- Transition to a use-case-focused point solution (such as customer support). Why? Platforms are toolkits to build anything, while point solutions are tools to solve particular problems. By focusing on a specific problem, like customer support, point solutions can fine-tune models and build problem-specific features that allow for stronger differentiation.
- Transition to a new differentiated value proposition. Why? Platforms that were built around proprietary NLU models need to refocus on a new differentiating factor. The edge could be anything that brands value as a differentiator, such as having the best user experience, the most secure platform, or potentially the widest integration library.
Some predictions for changes to the CAI model
As the industry adapts to the changes introduced by in-house teams and LLMs, I expect we’ll see a few other big changes:
- Adoption of Product-Led Growth (PLG). Why? Sales-Led Growth (SLG) requires large contract values to pay for salespeople and support. As contract values shrink due to weaker technology differentiation and the erosion of pricing power, CAI platforms will look for cheaper ways to acquire customers, such as PLG.
- Convergence of CAI platforms with Integration & RPA platforms. Why? Two years ago, automation workflows and language understanding in CAI platforms were tightly coupled. When building a bot, you were simultaneously teaching the bot language understanding and creating automation logic. Today, automation logic and language understanding have been decoupled thanks to LLMs. Traditional integration platforms like Zapier can now build chatbots because language understanding is as easy as an API call. Conversely, you can now build automation and integration workflows on Voiceflow that don’t have any conversational components. It’s still early for this convergence, but I suspect in the coming years we’ll see a broader category of “AI Automation” emerge. I wrote about a similar trend with the potential for an Automation Designer (AxD) title last week.
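To make the “language understanding is as easy as an API call” point concrete, here is a minimal sketch of what that decoupling looks like in practice: an automation platform can turn a free-form user message into a fixed intent with a single chat-completion-style request, the job an in-house NLU model used to do. The endpoint shape, model name, and intent labels below are illustrative assumptions, not any specific vendor’s API.

```python
import json

def build_intent_request(message: str, intents: list[str]) -> dict:
    """Build a single chat-completion-style request that classifies a
    free-form user message into one of a fixed set of intents.
    (Hypothetical payload shape for illustration only.)"""
    return {
        "model": "example-llm",  # assumed model name
        "messages": [
            {
                "role": "system",
                "content": (
                    "Classify the user's message into exactly one of these "
                    "intents and reply with the intent name only: "
                    + ", ".join(intents)
                ),
            },
            {"role": "user", "content": message},
        ],
    }

# An integration-platform step would POST this payload to an LLM endpoint
# and branch its workflow on the returned intent name.
request = build_intent_request(
    "Where is my order? It was supposed to arrive yesterday.",
    ["order_status", "refund_request", "product_question"],
)
print(json.dumps(request, indent=2))
```

The point is the surface area: no training data, no model hosting — one prompt and one HTTP call replace what used to be a dedicated NLU pipeline.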
- Wider adoption of Usage-Based Pricing. Why? Usage-based pricing charges customers only for value created. It’s been common in the CAI industry for customers to buy from a vendor only to never achieve value because it was too difficult to actually get a bot to production. As platforms become easier to use, and bots faster to launch, I expect we’ll see platforms price more on value actually attained (usage-based pricing). We’re moving further towards this model at Voiceflow.
- Use case expansion outside of customer support. Why? AI Agents have historically been so expensive to build that only use cases with high volume and proven ROI were economically viable (like customer support). As LLMs have reduced the cost of building bots by an order of magnitude, I expect we’ll see a corresponding explosion of use cases. We’re already seeing this at Voiceflow: brands are now building bots for low-to-mid-volume use cases that wouldn’t have merited the expense of an automation build-out before.
- In-house CAI teams will get larger. Why? This is directly correlated to use case expansion. As the number of ROI-viable use cases for AI automation grows, so will the need for a centralized team to build and manage them.
I’m incredibly excited for the next few years of our industry. While some platforms may fade under the changing reality, I believe many will thrive within a growing market opportunity.
For Voiceflow, the advances in LLMs have freed us from focusing solely on the creation layer with industry-leading UX; we can now compete with the underlying CAI platforms themselves, as the technology is accessible to everyone. Our goal is to build the industry’s leading conversational AI platform and free customers from having to choose between a usable platform and powerful technology.
Exciting times ahead!