As of March 2023, we’ve reviewed 11 platforms and their announcements about how they have incorporated, or are planning to incorporate, LLMs into their chatbot platforms. As a note, we have excluded any use of LLMs in intent recognition to keep the focus on the more generative use cases.
Who is doing what?
Botpress
Botpress is an open-source chatbot platform headquartered in Quebec, which released OpenBook in June 2022. OpenBook is a generative question answering feature that allows a business to upload documents and knowledge repositories, which can then be queried in a conversational style.
According to their website, they have completely redesigned their platform to take advantage of large language models. They offer BotpressGo, which lets you create grounded question answering functionality simply by adding a URL, along with GPT-powered personas and workflows, and they make use of Codex to enable chatbot builders to design more complex dialog flows.
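Botpress haven’t published the internals, but grounded question answering of this kind is usually built as retrieval-augmented generation: split the uploaded documents into chunks, embed them, retrieve the chunks most relevant to the question, and ask the LLM to answer using only that context. Below is a minimal sketch of that pattern using the OpenAI Python SDK; the model names, prompt wording and scoring are my own illustrative assumptions, not Botpress’s implementation.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def embed(texts: list[str]) -> list[list[float]]:
    """Embed a batch of document chunks (model choice is illustrative)."""
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return [d.embedding for d in resp.data]


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / ((sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5))


def grounded_answer(question: str, chunks: list[str], top_k: int = 3) -> str:
    """Answer a question using only the most relevant uploaded document chunks."""
    chunk_vecs = embed(chunks)
    q_vec = embed([question])[0]
    ranked = sorted(zip(chunks, chunk_vecs), key=lambda c: cosine(q_vec, c[1]), reverse=True)
    context = "\n\n".join(chunk for chunk, _ in ranked[:top_k])
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative; any chat-capable model would work
        messages=[
            {"role": "system",
             "content": "Answer the user's question using only the provided context. "
                        "If the answer is not in the context, say you don't know."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```

The key design point is the grounding instruction: the model is told to refuse rather than improvise, which is what distinguishes these features from simply letting an LLM chat freely.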
Boost.ai
Norwegian-based Boost are also releasing a wide range of LLM-based features to strengthen their chatbot offering. Interestingly, the features cover not only their chatbot but also their live chat offering. In addition to generating intent training phrases automatically and rewriting response variations to match a specific tone of voice, Boost also offer features to aid your support agents, including summarizing chats that get handed over from bot to human, as well as an option to rewrite an agent’s manually written responses or personalize canned responses.
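Boost haven’t said how the handover summarization is built, but conceptually it is a single prompt: pass the transcript so far to an LLM with an instruction to produce a short briefing for the agent. A hedged sketch, with the prompt and model choice being my own assumptions:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def summarize_for_handover(transcript: list[dict]) -> str:
    """Condense a bot-to-human handover transcript into a short agent briefing."""
    chat_text = "\n".join(f"{turn['speaker']}: {turn['text']}" for turn in transcript)
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "Summarize this customer chat in three bullet points for a "
                        "support agent: the customer's issue, what the bot already "
                        "tried, and the customer's current mood."},
            {"role": "user", "content": chat_text},
        ],
    )
    return resp.choices[0].message.content


# Example usage (hypothetical transcript):
# summarize_for_handover([
#     {"speaker": "customer", "text": "My card was charged twice"},
#     {"speaker": "bot", "text": "I can help with billing questions..."},
# ])
```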
Kore.ai
Kore have one of the most comprehensive chatbot platforms around and have still managed to add further features utilizing LLMs, including zero-shot intents (intents that don’t require any training phrases) and improvements to regression testing. Furthermore, they have simplified the process of creating question and answer options through their Knowledge Graph: you simply upload a PDF.
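Kore don’t document the mechanics, but zero-shot intent recognition is generally achieved by giving the LLM the intent names (and optionally short descriptions) and asking it to pick the best match for an utterance, with no example phrases at all. A rough sketch under those assumptions; the intents, prompt and model are hypothetical:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

# Hypothetical intents: no training phrases, just names and descriptions.
INTENTS = {
    "check_order_status": "The customer wants to know where their order is.",
    "cancel_order": "The customer wants to cancel an existing order.",
    "talk_to_agent": "The customer wants to speak with a human.",
}


def zero_shot_intent(utterance: str) -> str:
    """Classify an utterance against intents that have no training phrases."""
    catalogue = "\n".join(f"- {name}: {desc}" for name, desc in INTENTS.items())
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative
        messages=[
            {"role": "system",
             "content": "Pick the single best matching intent name from the list below, "
                        "or reply 'none' if nothing fits. Reply with the name only.\n"
                        + catalogue},
            {"role": "user", "content": utterance},
        ],
        temperature=0,  # deterministic output suits classification
    )
    return resp.choices[0].message.content.strip()
```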
Cognigy
Cognigy announced their LLM beta in early February and it immediately captured the attention of many. Features include generating intent training phrases, entities and response variations. The beta also promises rephrasing chatbot responses in real time to take into account the customer’s utterance and tone, as well as improvements to regression testing.
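Real-time rephrasing of this kind can be sketched as a wrapper around the scripted answer: the LLM is given the customer’s last message and the fixed answer, and asked to adjust tone without changing the facts. This is my own illustration of the idea, not Cognigy’s implementation:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def rephrase_response(scripted_answer: str, last_customer_message: str) -> str:
    """Rewrite a scripted bot answer so it mirrors the customer's wording and tone."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative
        messages=[
            {"role": "system",
             "content": "Rewrite the bot's scripted answer so it acknowledges the "
                        "customer's message and matches their tone. Do not change "
                        "the facts in the answer."},
            {"role": "user",
             "content": f"Customer said: {last_customer_message}\n"
                        f"Scripted answer: {scripted_answer}"},
        ],
    )
    return resp.choices[0].message.content
```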
The final interesting Cognigy feature is the ability to describe the flow you want in natural language and have Cognigy create that flow in the flow designer. This is mirrored in code nodes, where you can describe what you want a snippet of code to do and Cognigy will use an LLM to write the code for you, greatly enhancing the capabilities of the platform while still keeping it simple for anyone to use.
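Generating a code node from a description is, at heart, a constrained code-generation prompt. A minimal sketch of the pattern, assuming a JavaScript code node and an OpenAI model (Cognigy’s actual model and prompting are not public):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def generate_code_node(description: str) -> str:
    """Turn a plain-language description into a code snippet for a flow's code node."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative
        messages=[
            {"role": "system",
             "content": "Write a short JavaScript snippet for a chatbot code node that "
                        "does what the user describes. Return only the code."},
            {"role": "user", "content": description},
        ],
        temperature=0,
    )
    return resp.choices[0].message.content


# Example (hypothetical request):
# generate_code_node("Read the order number from context and format it as ORD-XXXX")
```

In practice a builder would still review the generated snippet before publishing, which keeps the human in the loop while lowering the coding barrier.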
Chatlayer
Chatlayer is another platform with a feature focused on adapting the chatbot’s response to take into account the customer’s responses and tone of voice. In addition to this, Chatlayer is using LLMs to handle fallbacks: if the user utterance fails to match an intent, Chatlayer will use an LLM to attempt to get the conversation back on track.
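One plausible shape for LLM-powered fallback handling is to hand the unmatched message, the recent conversation and the bot’s supported topics to the model and ask it to steer the customer back. A hedged sketch under those assumptions (not Chatlayer’s actual design):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def fallback_reply(history: list[str], unmatched_utterance: str, topics: list[str]) -> str:
    """When no intent matches, ask an LLM to steer the customer back to supported topics."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative
        messages=[
            {"role": "system",
             "content": "The chatbot could not match the customer's last message to any "
                        "intent. Write a short reply that acknowledges the message and "
                        "offers the closest of these supported topics: " + ", ".join(topics)},
            {"role": "user",
             "content": "Conversation so far:\n" + "\n".join(history)
                        + f"\nUnmatched message: {unmatched_utterance}"},
        ],
    )
    return resp.choices[0].message.content
```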
Haptik
Meanwhile, Haptik have also announced that they are integrating LLMs into their platform. As with many of the platforms mentioned above, Haptik are using LLMs to generate intent training phrases and variations on chatbot responses. They are also working on a generative question answering feature, as well as an option to have the chatbot adapt its responses according to the customer’s utterances.
Ada
Ada is another company that has looked into how LLMs can be used to generate training phrases for intents, and they have an in-depth article here. They have also announced an intention to build functionality that automatically creates conversational flows and content from a customer knowledge base. Ada are also investigating how LLMs can be used to improve analysis of, and insights into, conversational data.
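Ada’s article goes into far more depth, but the basic pattern for training phrase generation is straightforward: describe the intent, ask the LLM for varied paraphrases, and have a designer review them before they join the training set. A minimal sketch, with the prompt, temperature and model being my own assumptions:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def generate_training_phrases(intent_description: str, examples: list[str], n: int = 10) -> list[str]:
    """Draft candidate training phrases for an intent; a designer should review them."""
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative
        messages=[
            {"role": "system",
             "content": f"Write {n} varied ways a customer might phrase this request, "
                        "one per line, with no numbering."},
            {"role": "user",
             "content": f"Intent: {intent_description}\nExisting examples: " + "; ".join(examples)},
        ],
        temperature=0.9,  # higher temperature encourages more varied paraphrases
    )
    return [line.strip() for line in resp.choices[0].message.content.splitlines() if line.strip()]
```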
OpenDialog
OpenDialog is another chatbot platform that is planning on releasing LLM powered intent training and response variation. They are also looking at how LLMs can be used to identify scope for new intents as well as using LLMs to help test their chatbots.
Rasa
Rasa have introduced what they are calling Intentless Dialogue. The core of this approach appears to be an LLM that you train with question-and-answer pairs, which, when combined with their End-to-End policy, allows their model to achieve even greater accuracy with minimal additional training. In this case it appears to be less about what the LLM itself can do or what it has been trained on, and more about how it combines with the rest of the Rasa platform that really delivers the value.
Teneo
Teneo have announced their OpenAI Connector, enabling chatbot builders to connect their AI Assistants to a GPT model of their choice and use its functionality from within their chatbot.
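At its simplest, a direct connection like this is just a call to the model provider’s API from within a flow, with the result stored somewhere the bot can use it. Roughly, and purely as an illustration (the connector’s actual interface, model and variable handling are not described by Teneo):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set


def ask_gpt(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """Call a GPT model of your choice and return the generated text."""
    resp = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


# Inside a flow, the returned text might be saved to a conversation variable
# (e.g. conversation_state["gpt_answer"]) and used in a later node -- names here are hypothetical.
```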
Microsoft Power Virtual Agents
Finally, the most recent release on our list comes from Microsoft Power Virtual Agents, who have released a very user-friendly way to incorporate generative question answering into your chatbot. They call this feature “AI-based boosted conversations”, and it allows you to enter any web address through a simple interface to incorporate the information held there into your chatbot.
A summary of the features so far…
Looking at the platforms above, the features being offered fall into several groups:
- Identifying potential new intents
- Generating responses “offline”
- Zero-shot or few-shot intent recognition
- Improving regression testing
- Generative question answering
- Generative “small talk”
- Natural language flow designers
- Generating response variations during “run time”
- Direct connections to an LLM
Final thoughts
So why is it important to understand what platforms are doing with LLMs? Well, I think the main reason is that there are still a number of issues with LLMs that mean they are not yet suited to enterprise use cases. Thanks to their in-depth understanding of the problems LLMs could eventually solve, Conversational AI platform providers are at the cutting edge of trying to solve these problems.
Having looked at the various proposed uses of LLMs, my first thought is that Rasa’s intentless approach, along with the platforms building functionality to generate flows and to empower conversation designers with natural-language code nodes, represents the most interesting direction and has the greatest potential to transform chatbot creation.
Many of the other LLM-powered features now finding their way into chatbot platforms, such as boosting training phrases or response writing, represent the low-hanging fruit. They can still reduce the time it takes to create a chatbot and help make the conversation more adaptable to a customer’s current mood and communication style, but ultimately they create fewer new possibilities for teams building chatbots.
The similarity between what each platform is implementing suggests that most platform providers are taking a cautious approach to LLMs and are still finding their feet. We will continue to watch this space to see how the platforms change and develop their LLM offerings.