This blog post was first published on February 2, 2023.
This post gives an overview of the adoption and business applications of Generative AI (GAI) and Large Language Models (LLMs), such as ChatGPT, in the market as of February 2023. It also covers how we are working with them at Certainly. Our ideas will evolve, and we will have new ones as we learn; consider these a starting point.
You can find a primer on Generative AI, Large Language Models, and the differences between them and Certainly here.
Predictions on adoption
The natural language user experience of ChatGPT – the fact that it simulates an instant messenger – was crucial to its viral success. Its predecessor, the GPT-3 Playground, never achieved the same rapid adoption because its more technical, developer-oriented interface kept the masses from grasping the power of language AI. This shows that, while the underlying technology is the same, the most successful AI products will be those obsessed with the user experience.
In the wake of ChatGPT’s success, we will see an explosion in new products using Generative AI and LLMs to solve problems for businesses and consumers. These LLMs will improve massively and become standard infrastructure in many B2C and B2B products.
Business applications of these models seem to follow a three-wave progression:
Wave one: immediate business applications for ChatGPT
The first wave has been focused on content generation. It is relatively easy to take a generative model and finetune it to produce content for specific use cases. Lensa AI, for example, is a consumer-focused app: you upload headshots of yourself, and the app uses GAI to generate new profile pictures of you. Other examples are Jasper.ai, which helps marketers generate text, and GitHub Copilot, which helps developers generate code. This breed of products is essentially a finetuned model with a custom interface on top.
These tools serve a real function for the user. However, they are not commercially defensible, because others can quickly build a similar product. That is exactly what is happening: many competitors are launching, and when the products are near-identical, it becomes a race to capture the most customers with an easy-to-copy product while you figure out how to monetize your customer base and build a better moat.
After last week’s announcement that Microsoft will invest as much as $10 billion in OpenAI over the coming years, adding ChatGPT-like functionality to its business solutions seems more of a certainty. GitHub’s Copilot has been helping coders since 2021, and just yesterday the tech giant rolled out Teams Premium, which uses the technology to streamline meetings. The next logical step is adding Generative AI tools to the 365 Suite.
What is GREAT, though, is that these products help accelerate AI adoption and awareness in both businesses and the public.
Wave two: a replacement to search?
The next wave will be information retrieval: for example, custom search engines with a natural language interface that, rather than just returning a headline and a link to a webpage like Google does today, generate an actual answer to your query.
All indicators say that Bing will attempt a comeback in 2023. After Microsoft’s above-mentioned investment in OpenAI, a ChatGPT-integrated version of Bing (remember Bing?) is rumored to be on the horizon. This new breed of search engines could be a potential Google competitor.
Microsoft establishing Bing as a real competitor to Google would be the most significant disruption in the industry. Microsoft’s investment represents a multi-year partnership to not only potentially redefine search but also bring OpenAI’s capabilities to the enterprise and build countless applications on top. Interestingly, Microsoft does not even have to earn revenue from Bing. It is enough for them to steal market share from Google, which would hurt Google a lot, since the vast majority of its revenue comes from its search and ads business. The King is dead. Long live the Bing?
Natural language search engines will also be used in narrower use cases. You can, for example, finetune an LLM on the transcripts of your 1,000 podcast episodes and let users search for relevant information within that corpus of data.
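In practice, this kind of narrow search is often built with retrieval rather than full finetuning: each transcript chunk is turned into an embedding vector, and the chunks closest to the query embedding are returned (and can then be fed to an LLM to generate the answer). A minimal sketch of the retrieval step, using toy hand-made vectors in place of real embeddings (episode titles and numbers are illustrative, not real data):

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for real embeddings (in production these would come
# from an embedding model, one vector per transcript chunk).
episode_chunks = {
    "ep12: pricing strategy": [0.9, 0.1, 0.0],
    "ep47: hiring engineers": [0.1, 0.8, 0.2],
    "ep88: churn reduction":  [0.7, 0.2, 0.3],
}

def search(query_embedding, top_k=2):
    """Return the top_k chunk titles ranked by similarity to the query."""
    ranked = sorted(
        episode_chunks.items(),
        key=lambda item: cosine(query_embedding, item[1]),
        reverse=True,
    )
    return [title for title, _ in ranked[:top_k]]
```

A query whose embedding sits close to the "pricing" region, e.g. `search([0.8, 0.1, 0.1])`, surfaces the pricing and churn chunks first; a real system would then hand those chunks to the LLM as context for generating the answer.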
These are great use cases, but a natural language search engine is still relatively straightforward: essentially a single-turn Q&A conversation that chatbots can also handle. That brings me to the third wave, where I see Certainly fitting in.
Wave three: Actionable AI
The third wave is what we do. In this context, I will call it “Actionable AI”; that is to say, an AI system that performs actions on your behalf based on what you tell it to do. The high-level tech stack needed is a natural language user interface, LLMs, and technology that can control other systems and send/receive data.
The Actionable AI tech stack opens up a lot of opportunities, as it will allow non-coders like me to, for example, build software applications and connect third-party systems.
Actionable AI is especially well-suited for digital commerce. It enables consumers to shop from brands using their natural language and for brands to provide a more human-like experience.
Now, the chatbot can do the shopping for you as a consumer. It will understand your purchase needs and, based on that, take actions such as navigating to the right product, taking you through check-out, and returning a previous purchase.
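One way to picture the action layer in that stack: the language model classifies the shopper's message into an intent plus parameters, and a thin dispatch layer maps that onto calls into the shop's systems. A minimal sketch, where the intent names and shop-backend functions are hypothetical placeholders, not Certainly's actual API:

```python
# Hypothetical shop-backend calls the bot can trigger on the user's behalf.
def navigate_to_product(product):
    return f"Showing product page for {product}"

def start_checkout(product):
    return f"Checkout started for {product}"

def create_return(order_id):
    return f"Return created for order {order_id}"

# Maps intents (as an LLM might classify them) to concrete actions.
ACTIONS = {
    "find_product": lambda p: navigate_to_product(p["product"]),
    "buy": lambda p: start_checkout(p["product"]),
    "return_purchase": lambda p: create_return(p["order_id"]),
}

def handle(intent, params):
    """Dispatch a classified intent to the matching shop action."""
    action = ACTIONS.get(intent)
    if action is None:
        return "Sorry, I can't help with that yet."
    return action(params)
```

So a message like "I want to send back my last order" would be classified as `return_purchase` with the order ID extracted as a parameter, and `handle("return_purchase", {"order_id": "A-1042"})` performs the actual return. The hard part is everything around this sketch: reliable intent classification, parameter extraction, and the data exchange with real systems.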
Actionable AI products are the hardest of the three waves to build and monetize. This means, though, that they are equally hard to copy. The main reasons are:
- Building a platform that enables all the different layers to work together – the natural language interface, the AI models, the data exchange layer, and the action layer – requires a massive investment.
- Their job is to handle conversations between businesses and their customers and resolve use cases where there is no gray area between success and failure: the bot either helps me with what I need or it doesn't.
At Certainly, we have been bullish on the opportunities in this space from day one. We’ve spent years building out Actionable AI tech specifically for ecommerce businesses. Today we have happy customers in more than 20 countries and have handled more than half a billion interactions between brands and consumers since inception.
ChatGPT’s business opportunities for the Conversational AI industry
The most significant improvement is that LLMs will make the most annoying thing about chatbots disappear: their inability to understand what you want. No more “Sorry, I am a dumb chatbot. I didn’t understand you; please try again.”
They’ll make chatbots much better at understanding the user’s intent and make it much easier for businesses to build useful and loveable bots – bots that excel not only in single-step conversations, like answering a question, but also in multi-step conversations, for example, helping find the right product.
The improvement in utility and the increase in general awareness will lead to a rapid rise in chatbot usage across many industries and use cases. It will increase consumers’ desire to interact with chatbots, which in turn increases the ROI businesses get from the technology – a positive cycle leading to faster adoption. This is a huge opportunity for us and for everyone thinking about using chatbots in their business.
Incorporating LLMs into our product will accelerate what we already focus on
The recent progress in language technology, including the release of ChatGPT, has brought us closer to our vision of enabling any merchant to use a human-like digital sales assistant to help their customers. In fact, Large Language Models have accelerated our efforts and focus.
We can benefit from using language models from companies like OpenAI as part of our infrastructure rather than building all the generic models in-house. This shift is similar to the move from on-premise to cloud computing, where companies no longer had to spend resources on managing their servers but could focus on building products that solved their customers’ problems. By outsourcing the generic language model infrastructure, we can focus on the application layer and create valuable products for our customer segments. This will ultimately benefit our customers and make good business sense for us.
At Certainly, we are experimenting with LLMs and adding them in ways that are useful in end-user conversations, our customers’ workflows, and our internal work. We want them to do one of two things:
1. Improve on something we/our customers do today.
2. Enable us/customers to do something new.
In our product, this means we’re experimenting with:
- Enhancing the human-like conversations by enabling natural language generation, confined by and based on the individual brand’s content and ethos
- Improving bot-building workflows via generating content for answers, intents, and entities, even simulating entire conversational flows
- Simulating a user journey: Showing customers the quality of their flow in the bot builder and pointing out opportunities to improve
- Automating the testing of complex bots to avoid configuration mishaps
- Finetuning models on industry-specific data and customers’ products/FAQs so bots can automatically answer relevant questions and drive conversations around purchases within minutes
- Prompting the Certainly Supportbot to update bots based on customers’ instructions
- Providing OpenAI webhook templates for customers to freely experiment with and incorporate into their bots
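To make the first item on that list concrete – generation confined by the brand's own content – here is a sketch of how a webhook could build the request body for OpenAI's completions endpoint, with the confinement done in the prompt. The model name reflects what was available in early 2023, and the prompt wording and helper name are illustrative, not Certainly's actual template:

```python
import json

def build_openai_request(brand_content, user_question, model="text-davinci-003"):
    """Build the JSON body for a completion request that is confined
    to the brand's own content via prompt instructions."""
    prompt = (
        "Answer the customer using ONLY the brand information below. "
        "If the answer is not in it, say you don't know.\n\n"
        f"Brand information:\n{brand_content}\n\n"
        f"Customer: {user_question}\nAssistant:"
    )
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "max_tokens": 150,
        "temperature": 0.2,  # low temperature keeps answers close to the source
    })
```

The webhook would POST this body to the completions endpoint with the customer's API key and return the generated text to the bot; the "confined by the brand's content and ethos" part lives entirely in what goes into `brand_content` and the instructions around it.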
And in our day-to-day work:
- Assisting QA with test cases before releasing new features
- Automating the generation of standard code
- Creation and clean-up of datasets
- Assisting in content generation for help center articles and documentation
ChatGPT business opportunities for Certainly
Incorporating LLMs as part of our infrastructure enables us to do several things to accelerate growth:
- Getting the time-to-live for customers from weeks down to minutes
- Lowering the barrier to entry for merchants adopting useful chatbot technology
- Reducing customer acquisition costs and cost to serve
- Expanding our addressable market
- Solving the chatbot challenge for retailers with many types of products
- Providing template bot solutions for partners
- Focusing our Product team on building differentiation instead of infrastructure
Using LLMs to make conversations more natural – including understanding multiple intents and entities and remembering what was said earlier in the conversation – improves the end-user experience and the ROI for merchants, so merchants will scale their usage of Certainly bots faster. The improved UX means more end-users want to chat with bots, and better UX plus better business ROI means more businesses adopt them, which expands the addressable market. Faster time-to-live through specialization means lower CAC and cost to serve, expanding that market further still.
And that’s the dream. Ultimately, what we want to do is provide a service that will unlock more time for you to do the things that give you energy and that you love.