Today's customer service organizations face a tremendous opportunity. As customer expectations rise, brands have the chance to creatively apply new innovations to transform the customer experience. Meeting increasing customer demands is a challenge, but the latest breakthroughs in conversational artificial intelligence (AI) are helping companies meet these expectations.
Customers today expect timely answers to their questions that are helpful, accurate, and tailored to their needs. The new QnAIntent, powered by Amazon Bedrock, helps meet these expectations by understanding questions posed in natural language and responding conversationally in real time using your own authorized knowledge sources. A Retrieval Augmented Generation (RAG) approach enables Amazon Lex to harness both the breadth of knowledge available in those repositories and the fluency of large language models (LLMs).
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI.
In this post, we show you how to add generative AI question answering capabilities to your bot, using your own curated knowledge sources, without writing a single line of code.
Read on to find out how QnAIntent can transform your customer experience.
Solution overview
Implementing the solution involves the following high-level steps:
- Create an Amazon Lex bot.
- Create an Amazon Simple Storage Service (Amazon S3) bucket and upload the PDF files that contain the information you will use to answer the questions.
- Create an Amazon Bedrock knowledge base that uses the Amazon Titan Embeddings model to split the data into chunks and generate embeddings. As part of this process, Amazon Bedrock Knowledge Bases automatically creates an Amazon OpenSearch Serverless vector search collection to hold the vectorized data.
- Add a new QnAIntent that uses the knowledge base to find answers to customers' questions and the Anthropic Claude model to generate answers, including answers to follow-up questions.
Prerequisites
To follow the steps in this post, you need access to an AWS account with permissions to use Amazon Lex, Amazon Bedrock (including access to the Anthropic Claude model and the Amazon Titan Embeddings or Cohere Embed model), Amazon Bedrock Knowledge Bases, and the OpenSearch Serverless vector engine. To request access to the Amazon Bedrock models, complete the following steps:
- On the Amazon Bedrock console, choose Model access in the navigation pane.
- Choose Manage model access.
- Select the Amazon and Anthropic models. (You can also use a Cohere model for embeddings.)
- Choose Request model access.
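If you want to confirm programmatically which foundation models are visible in your Region, the following optional boto3 sketch lists them. It assumes you have AWS credentials configured and uses us-east-1 as an example Region; model access grants themselves are still managed on the Amazon Bedrock console.

```python
import boto3

# List the foundation models visible in this Region (assumes configured AWS credentials).
bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.list_foundation_models()
for model in response["modelSummaries"]:
    # Show only the providers relevant to this walkthrough.
    if model["providerName"] in ("Anthropic", "Amazon", "Cohere"):
        print(model["providerName"], model["modelId"])
```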
Create an Amazon Lex bot
If you already have a bot you want to use, you can skip this step.
- On the Amazon Lex console, choose Bots in the navigation pane.
- Choose Create bot.
- Select Start with an example and select the BookTrip sample bot.
- For Bot name, enter a name for your bot (for example, BookHotel).
- For Runtime role, select Create a role with basic Amazon Lex permissions.
- In the Children's Online Privacy Protection Act (COPPA) section, select No, because this bot is not intended for children under 13 years of age.
- Keep Idle session timeout set to 5 minutes.
- Choose Next.
- If you use QnAIntent to answer questions with your bot, we recommend increasing the confidence threshold for intent classification so a question isn't mistakenly matched to one of your other intents. For this post, we set it to 0.8; you may need to adjust this up or down based on your own testing.
- Choose Done.
- Choose Save intent.
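The console wizard above also imports the BookTrip sample intents. If you prefer to automate bot creation, the following boto3 sketch creates an empty bot with the same settings; it does not import the sample intents, and the IAM role ARN shown is a placeholder you must replace with a role that has basic Amazon Lex permissions.

```python
import boto3

lex = boto3.client("lexv2-models")

response = lex.create_bot(
    botName="BookHotel",
    description="Example bot for hotel FAQs",
    roleArn="arn:aws:iam::123456789012:role/LexBasicRole",  # placeholder role ARN
    dataPrivacy={"childDirected": False},  # COPPA: not directed at children under 13
    idleSessionTTLInSeconds=300,           # 5-minute idle session timeout
)
print(response["botId"])
```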
Upload content to Amazon S3
Now, create an S3 bucket to store the documents you will use for your knowledge base.
- On the Amazon S3 console, choose Buckets in the navigation pane.
- Choose Create bucket.
- For Bucket name, enter a unique name.
- Leave all other options at their default values and choose Create bucket.

For this post, we created an FAQ document for FictitiousHotels, a fictional hotel chain from Example Corp. Download the PDF document to follow along.
- On the Buckets page, navigate to the bucket that you created. If you don't see it, you can search for it by name.
- Choose Upload.
- Choose Add files.
- Select the ExampleCorpFicticiousHotelsFAQ.pdf file you downloaded.
- Choose Upload.

You should now be able to access the files in your S3 bucket.
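If you'd rather script this step, the following boto3 sketch creates the bucket and uploads the FAQ document. The bucket name is a placeholder (bucket names must be globally unique), and outside us-east-1 you also need to supply a location constraint.

```python
import boto3

s3 = boto3.client("s3")

# Placeholder name; S3 bucket names must be globally unique.
bucket_name = "examplecorp-fictitioushotels-faq-docs"

# Outside us-east-1, add CreateBucketConfiguration={"LocationConstraint": "<your-region>"}.
s3.create_bucket(Bucket=bucket_name)

# Upload the FAQ document you downloaded earlier.
s3.upload_file(
    "ExampleCorpFicticiousHotelsFAQ.pdf",
    bucket_name,
    "ExampleCorpFicticiousHotelsFAQ.pdf",
)
```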
Create a knowledge base
Now you can set up your knowledge base.
- On the Amazon Bedrock console, choose Knowledge bases in the navigation pane.
- Choose Create knowledge base.
- For Knowledge base name, enter a name.
- For Knowledge base description, enter an optional description.
- Select Create and use a new service role.
- For Service role name, enter a name or keep the default.
- Choose Next.
- For Data source name, enter a name.
- Choose Browse S3 and navigate to the S3 bucket where you uploaded the PDF file earlier.
- Choose Next.
- Select an embeddings model.
- Select Quick create a new vector store to create a new OpenSearch Serverless vector store for the vectorized content.
- Choose Next.

- Review your settings, then choose Create knowledge base.
After a few minutes, the knowledge base will be created.
- Choose Sync to split the documents into chunks, compute embeddings, and store them in the vector store.
This may take a while. You can continue with the remaining steps, but the sync must complete before you can query the knowledge base.
- Copy the knowledge base ID; you will refer to it when you add this knowledge base to your Amazon Lex bot.
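Once the sync has finished, one optional way to confirm that the documents were ingested is to query the knowledge base directly. The following boto3 sketch uses a placeholder knowledge base ID and a sample question; replace both with your own values.

```python
import boto3

bedrock_agent_runtime = boto3.client("bedrock-agent-runtime")

# Placeholder knowledge base ID; use the ID you copied above.
response = bedrock_agent_runtime.retrieve(
    knowledgeBaseId="ABCDEFGHIJ",
    retrievalQuery={"text": "What are the pool hours in Las Vegas?"},
)

# Each result includes the retrieved text chunk and a relevance score.
for result in response["retrievalResults"]:
    print(result["score"], result["content"]["text"][:200])
```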

Add QnAIntent to the Amazon Lex bot
To add a QnAIntent, follow these steps:
- On the Amazon Lex console, choose Bots in the navigation pane.
- Choose your bot.
- In the navigation pane, choose Intents.
- On the Add intent menu, choose Use built-in intent.
- For Built-in intent, choose AMAZON.QnAIntent.
- For Intent name, enter a name.
- Choose Add.
- Choose the model you want to use to generate the answers (in this case, Anthropic Claude 3 Sonnet, but you could also choose Anthropic Claude 3 Haiku as a lower-latency, lower-cost option).
- For Select knowledge store, select Amazon Bedrock knowledge base.
- For Amazon Bedrock knowledge base ID, enter the ID you noted when you created the knowledge base.
- Choose Save intent.
- Choose Build to build the bot.
- Choose Test to test the new intent.
The following screenshot shows an example of a conversation with a bot.

Note that the second question, about pool hours in Miami, references the previous question about pool hours in Las Vegas and still gets the right answer based on the conversation history.
You can also ask questions that require the bot to do a bit of reasoning with the available data: when asked about resorts that are good for families, the bot recommended resorts in Orlando based on the availability of kid-friendly activities, proximity to theme parks, and so on.
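If you want to exercise the bot outside the console test window, you can send the same questions through the Amazon Lex V2 runtime API. The following boto3 sketch uses placeholder bot and session IDs; TSTALIASID is the alias the console uses to test the draft version of the bot.

```python
import boto3

lex_runtime = boto3.client("lexv2-runtime")

# Placeholder IDs; replace botId with your bot's ID.
response = lex_runtime.recognize_text(
    botId="ABCDEFGHIJ",
    botAliasId="TSTALIASID",   # draft test alias
    localeId="en_US",
    sessionId="test-session-1",
    text="What are the pool hours at the Las Vegas resort?",
)

# Print the bot's response messages.
for message in response.get("messages", []):
    print(message["content"])
```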
Update the confidence threshold
Some questions may be incorrectly matched to one of your other intents. If this happens, you can adjust your bot's confidence threshold. To change this setting, choose your bot's language (English) and, in the Language details section, choose Edit.
After updating the confidence threshold, rebuild your bot for the changes to take effect.
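You can also make this change programmatically. The following boto3 sketch updates the intent classification confidence threshold on the draft version of the bot and then rebuilds the locale; the bot ID is a placeholder.

```python
import boto3

lex = boto3.client("lexv2-models")

# Placeholder bot ID; DRAFT targets the working draft of the bot.
lex.update_bot_locale(
    botId="ABCDEFGHIJ",
    botVersion="DRAFT",
    localeId="en_US",
    nluIntentConfidenceThreshold=0.8,
)

# Rebuild the locale so the new threshold takes effect.
lex.build_bot_locale(botId="ABCDEFGHIJ", botVersion="DRAFT", localeId="en_US")
```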

Configure additional steps
By default, the next step in your bot's conversation is set to Wait for user input. After a question is answered, the conversation remains within the bot, allowing the user to ask follow-up questions or invoke any of the bot's other intents.
If you instead want to end the conversation and return control to the calling application (such as Amazon Connect), you can change this behavior to End conversation. To update the setting, follow these steps:
- On the Amazon Lex console, navigate to the QnAIntent.
- In the Fulfillment section, choose Advanced options.
- On the Next step in conversation dropdown menu, choose End conversation.

If you want your bot to add a specific message after each response from a QnAIntent (such as “Is there anything else I can help you with?”), you can add a closing response to the QnAIntent.
Clean up
To avoid incurring ongoing costs, delete the resources you created as part of this post.
- Amazon Lex bot
- S3 bucket
- OpenSearch Serverless collection (not automatically deleted when you delete the knowledge base)
- Amazon Bedrock knowledge base
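If you prefer to script the cleanup, the following boto3 sketch removes each resource. Every ID and name shown is a placeholder; replace them with the values for the resources you created.

```python
import boto3

# Delete the Amazon Lex bot (placeholder bot ID).
boto3.client("lexv2-models").delete_bot(botId="ABCDEFGHIJ")

# Empty the S3 bucket before deleting it (placeholder bucket name).
bucket = boto3.resource("s3").Bucket("examplecorp-fictitioushotels-faq-docs")
bucket.objects.all().delete()
bucket.delete()

# Delete the knowledge base (placeholder knowledge base ID).
boto3.client("bedrock-agent").delete_knowledge_base(knowledgeBaseId="ABCDEFGHIJ")

# The OpenSearch Serverless collection is not removed with the knowledge base
# (placeholder collection ID).
boto3.client("opensearchserverless").delete_collection(id="abc123xyz")
```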
Conclusion
The new QnAIntent in Amazon Lex enables natural conversations by connecting customers with curated knowledge sources. Powered by Amazon Bedrock, QnAIntent understands natural language questions, responds conversationally, and keeps customers engaged with contextual follow-up responses.
QnAIntent leverages the latest innovations to transform static FAQs into fluid conversations that resolve customer needs, allowing you to scale great self-service to delight your customers.
Try it for yourself and reinvent the customer experience!
About the Author
Thomas Linfas is a Senior Solutions Architect on the Amazon Lex team, where he invents, develops, prototypes, and evangelizes new technical features and solutions for Language AI services that improve the customer experience and ease adoption.
