As of May 1st 2023, we’ve reviewed 14 platforms and their announcements about how they have incorporated, or are planning to incorporate, LLMs into their chatbot platforms. Note that we have excluded the use of LLMs in intent recognition to keep the focus on the more generative use cases.
Who is doing what?
Ada has looked into how LLMs can be used to generate training phrases for intents and has published an in-depth article on the topic. The company also announced plans to build functionality that automatically creates conversational flows and content from a customer knowledge base, and is exploring how LLMs can be used to improve analysis of and insights into conversational data.
Norwegian-based Boost is releasing a wide range of LLM-powered features to strengthen their chatbot offering. Interestingly, these features not only cover their chatbot but also their live chat offering.
In addition to generating intent training phrases automatically and rewriting response variations to match a specific tone of voice, Boost offers features to aid support agents: summarizing chats that get handed over from bot to human, rewriting manually written responses, and personalizing canned responses for an agent.
Botpress have now launched their redesigned platform with several interesting features. One lets you define and set the chatbot’s personality with a single prompt. On top of that, there are several features designed to make building chatbots easier, including a code generator and a feature that can design transitions between nodes from plain-text descriptions.
Most innovative is what Botpress calls ‘AI task cards’, designed to make it easy to use an LLM within a dialog flow at runtime. An AI task card gives the user a structured way to input prompts, insert stored variables, and define how outputs should be stored. The goal, it would seem, is to make it simpler to utilize LLMs for more complex custom functions. Building on that, you can chain AI task cards together to achieve much more complex tasks.
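The general pattern behind chaining prompt "cards" like this can be sketched in a few lines. This is not Botpress’s actual implementation; the function names, templates, and state keys below are invented for illustration, and `run_llm` is a stand-in for a real model call.

```python
# A minimal sketch of the "AI task card" pattern: a prompt template is
# filled with stored conversation variables, sent to the model, and the
# output is stored back under a named key so a later card can consume it.

def run_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM API here.
    return f"<llm output for: {prompt}>"

def run_task_card(template: str, state: dict, output_key: str) -> dict:
    """Fill the template with stored variables, call the model,
    and store the result under output_key."""
    prompt = template.format(**state)
    state[output_key] = run_llm(prompt)
    return state

# Chaining: the second card consumes the first card's output.
state = {"user_message": "My order #123 never arrived."}
state = run_task_card("Summarize the issue: {user_message}", state, "summary")
state = run_task_card("Draft a polite reply to: {summary}", state, "reply")
```

Each card only needs to know the names of the variables it reads and writes, which is what makes the chaining composable.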
Finally, Botpress also has BotpressGO which allows you to provide a URL for your knowledge base and create a question answering functionality instantly.
Certainly are a chatbot platform with a strong focus on e-commerce, and offer a connection to OpenAI among their available integrations. The connection is designed for flexibility and allows Certainly to deliver generative-AI-powered journeys that also make use of content such as items from the merchant’s product catalog. Another feature to note is the attention paid to controlling the outputs of GPT models by introducing topic and keyword blacklists into the user interface.
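To make the blacklisting idea concrete, here is a minimal sketch of output filtering, assuming a simple substring check. The blacklist terms and fallback text are invented for the example; a production system would likely use more robust matching.

```python
# A minimal sketch of keyword/topic blacklisting for generated responses:
# if a generated reply touches a blacklisted term, swap in a safe fallback.

BLACKLIST = {"competitor", "medical advice"}  # invented example terms

def filter_response(generated: str,
                    fallback: str = "Let me connect you with an agent.") -> str:
    """Return the generated response, or the fallback if it is blocked."""
    text = generated.lower()
    if any(term in text for term in BLACKLIST):
        return fallback
    return generated

blocked = filter_response("Our competitor offers that too.")
allowed = filter_response("Your parcel ships tomorrow.")
```

The key design point is that the check runs on the model’s *output*, so it catches off-topic generations regardless of what prompt produced them.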
Chatlayer is another platform with a feature focused on adapting the chatbot’s responses to take into account the customer’s messages and tone of voice. In addition, Chatlayer is using LLMs to handle fallbacks: if the user utterance fails to match an intent, Chatlayer will use an LLM to attempt to get the conversation back on track.
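The fallback pattern is straightforward to sketch: route to the LLM only when the intent classifier’s confidence drops below a threshold. The classifier, threshold, and intents here are all invented stand-ins, not Chatlayer’s actual logic.

```python
# A sketch of LLM-backed fallback handling: low-confidence classifications
# go to an LLM instead of a generic "I didn't understand" message.

def classify(utterance: str) -> tuple[str, float]:
    # Stub classifier: pretend we only recognize greetings confidently.
    if "hello" in utterance.lower():
        return ("greet", 0.95)
    return ("unknown", 0.20)

def llm_fallback(utterance: str) -> str:
    # Stub for an LLM call that tries to steer the conversation back on track.
    return f"Sorry, I'm not sure about '{utterance}'. Could you rephrase?"

def respond(utterance: str, threshold: float = 0.7) -> str:
    intent, confidence = classify(utterance)
    if confidence >= threshold:
        return f"Handling intent: {intent}"
    return llm_fallback(utterance)
```

Tuning the threshold trades off how often the (slower, costlier) LLM is invoked against how often users hit a dead end.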
Cognigy announced their LLM beta in early February and it immediately captured the attention of many. Features include intent training phrase and entity generation and response variation generation. The beta also promises rephrasing chatbot responses in real-time to take into account the customer utterance and their tone, as well as improvements to regression testing.
The final interesting feature lets you describe the flow you want in natural language and have Cognigy create it in the flow designer. This is mirrored in code nodes, where you can describe what you want a snippet of code to do and Cognigy will use an LLM to write it for you, greatly extending the capabilities of the platform while keeping it simple for anyone to use.
Meanwhile, Haptik has also announced that they are integrating LLMs into their platform. As with many of the platforms mentioned above, Haptik are using LLMs to generate intent training phrases and variations on chatbot responses. They are also working on a grounded question answering feature as well as an option to have the chatbot alter its responses according to the customer’s utterances.
Kore have one of the most comprehensive chatbot platforms around and have still managed to add additional LLM-powered features, including zero-shot intents (intents that don’t require any training phrases) and improvements to regression testing. Furthermore, they have simplified the process of creating question-and-answer options in their Knowledge Graph to simply uploading a PDF.
More recently, Kore have released a feature that promises to more or less build a complete chatbot based on a description of the chatbot’s purpose: Kore’s Automatic Dialog Generation Dialog Tasks for Conversation Design, Logic Building & Training. The most recent release also includes “Dynamic Paraphrasing”, allowing responses to be more contextually relevant, but we haven’t been able to test it yet.
Microsoft Power Virtual Agents have released a very user friendly way to incorporate grounded question answering into your chatbot. They call this feature “AI-based boosted conversations” and it allows you to enter any web address through a simple interface to incorporate the information held there into your chatbot.
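Under the hood, grounded question answering of this kind typically means retrieving the most relevant chunk of the source content and instructing the model to answer only from it. The sketch below stubs out both the page fetching and the model call, and uses naive word overlap where a real system would use embeddings; the content and function names are invented.

```python
# A rough sketch of grounded question answering over page content:
# pick the most relevant chunk, then build a prompt that restricts the
# model to that chunk.

def retrieve(chunks: list[str], question: str) -> str:
    """Pick the chunk with the greatest word overlap with the question."""
    q_words = set(question.lower().split())
    return max(chunks, key=lambda c: len(q_words & set(c.lower().split())))

def grounded_prompt(context: str, question: str) -> str:
    return (
        "Answer using only the context below. If the answer is not in "
        f"the context, say you don't know.\n\nContext: {context}\n"
        f"Question: {question}"
    )

# Invented example content, standing in for chunks scraped from a URL.
chunks = [
    "Shipping takes 3-5 business days within the EU.",
    "Returns are accepted within 30 days of delivery.",
]
question = "How long does shipping take?"
prompt = grounded_prompt(retrieve(chunks, question), question)
```

The grounding instruction (“only the context below”) is what keeps answers tied to the provided website rather than the model’s general training data.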
Another announcement mentioned a future release of Power Virtual Agents Copilot, which aims to assist in the building of chatbots by allowing users to describe in natural language what they want their bot to do. We suggest reading the blog article to learn more about the features. It seems you can test them, but currently only in the US and only in English.
OpenDialog is another chatbot platform that is planning on releasing LLM-powered intent training and response variation. They are also looking at how LLMs can be used to identify scope for new intents as well as using LLMs to help test their chatbots.
Rasa have introduced what they are calling Intentless Dialogue. The core of this approach appears to be an LLM that you train with question-answer pairs, which, when combined with their End-to-End policy, gives their model even greater accuracy with minimal additional training.
In this case, it appears less to be about what the LLM itself can do or what it has been trained on, but how it combines with the rest of the Rasa platform that really delivers the value.
Teneo have announced their OpenAI Connector, which enables chatbot builders to connect their AI assistants to a GPT model of their choice and use its capabilities within their chatbots. These include response variations, using GPT as a fallback, and summarizing conversations for human handover. Finally, one of the more novel solutions Teneo is offering is using GPT for sentiment analysis of conversations.
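Sentiment analysis via a GPT-style model usually comes down to a classification prompt. The sketch below stubs the model with a trivial keyword heuristic so it runs standalone; this is not Teneo’s implementation, and in practice you would send the prompt to your chosen GPT model and parse its one-word reply.

```python
# A sketch of prompting a GPT-style model for conversation sentiment.

def call_gpt(prompt: str) -> str:
    # Stub: a keyword heuristic standing in for the model's reply.
    text = prompt.lower()
    if "thanks" in text or "great" in text:
        return "positive"
    if "angry" in text or "terrible" in text:
        return "negative"
    return "neutral"

def conversation_sentiment(transcript: str) -> str:
    prompt = (
        "Classify the overall customer sentiment of this conversation as "
        f"positive, negative, or neutral:\n\n{transcript}"
    )
    return call_gpt(prompt)
```

Constraining the model to a fixed label set (“positive, negative, or neutral”) keeps the output easy to store and aggregate across conversations.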
Ultimate have launched a grounded question answering feature they are calling UltimateGPT, which will connect through their Zendesk, Salesforce, and Freshdesk integrations. They also expect to release a range of LLM-powered features to increase the efficiency of the chatbot building process.
Voiceflow have gone all in with LLMs, putting them at the core of their product with the goal of making it really easy for anyone to build with LLMs. And honestly, it works very intuitively.
Features include training phrase generation, generating multiple versions of the same response, and generative fallback handling. They’ve also shipped two more advanced features that let you experiment with summarization, classification, categorization, grounded question answering, and more. This means you can test and prototype LLM-powered solutions incredibly quickly, without any development experience.
A summary of the features so far…
Looking at the platforms above, the features being offered fall into several groups and subgroups:
- Identifying potential new intents
- Generating responses “offline”
- Zero-shot or few-shot intent recognition
- Improving regression testing
- Generative/grounded question answering
- Generative “small talk”
- Natural language flow designers
- Sentiment analysis
- Generating response variations during “run time”
- Direct connections to an LLM and supporting interfaces
So why is it important to understand what platforms are doing with LLMs? The main reason is that LLMs still have a number of issues that make them unsuited to enterprise use cases just yet. Thanks to their in-depth understanding of the problems LLMs could eventually solve, Conversational AI platform providers are at the cutting edge of trying to solve these issues.
Having looked at the various proposed uses of LLMs, my first thought is that Rasa’s intentless approach, along with the platforms building functionality to generate flows and empower conversation designers with natural-language code nodes, represent the most interesting options, with the most potential to transform chatbot creation.
Many of the other LLM-powered features now finding their way into chatbot platforms, such as boosting training phrases or response writing, represent low-hanging fruit. They can still reduce the time it takes to create a chatbot and make conversations more adaptable to a customer’s current mood and communication style, but they ultimately create fewer new possibilities for teams building chatbots.
The strong similarity between what the platforms are implementing suggests that most providers are taking a cautious approach to LLMs and are still finding their feet. Of course, we’ll continue to watch this space to see how the platforms develop their LLM offerings.