Improve LLM responses in RAG use cases by interacting with the user

Generative AI and large language models (LLMs) are often used to build question answering systems grounded in external knowledge. However, such systems struggle when a question is vague or ambiguous and the necessary context is missing. To address this, we introduce an interactive clarification component built with LangChain that lets the agent hold a conversational dialogue with the user, gather the missing context, and provide an accurate answer. The solution is demonstrated with an Amazon Kendra index, an Amazon Bedrock LLM, and a Streamlit user interface.

One common application of generative AI and large language models (LLMs) is answering questions based on external knowledge. However, traditional systems often struggle with vague or ambiguous questions, leading to unhelpful or incorrect responses. In this post, we introduce a solution to enhance the quality of answers in such cases by incorporating interactive clarification using LangChain.

Solution Overview

To demonstrate the solution, we set up an Amazon Kendra index, a LangChain agent with an Amazon Bedrock LLM, and a Streamlit user interface.
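
As a minimal sketch of how the core building blocks are wired together (assuming the LangChain Amazon Kendra and Amazon Bedrock integrations are installed and AWS credentials are configured; in newer LangChain releases the same classes live in langchain_community/langchain_aws, and the index ID, Region, and model ID below are placeholders):

    from langchain.llms import Bedrock
    from langchain.retrievers import AmazonKendraRetriever

    # Retriever backed by the Amazon Kendra index deployed as a prerequisite
    retriever = AmazonKendraRetriever(
        index_id="YOUR-KENDRA-INDEX-ID",  # placeholder index ID
        region_name="us-east-1",          # placeholder Region
    )

    # Amazon Bedrock LLM that drives the LangChain agent
    llm = Bedrock(
        model_id="anthropic.claude-v2",   # assumed model; any Bedrock text model works
        model_kwargs={"temperature": 0.0, "max_tokens_to_sample": 512},
    )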

Prerequisites

To run this demo, complete the following prerequisites:

  • Clone the GitHub repository and follow the steps in the README.
  • Deploy an Amazon Kendra index in your AWS account.
  • Set up the LangChain agent with access to the required Amazon Bedrock foundation model.
  • Use Amazon SageMaker Studio to run the Streamlit app.

Implement the Solution

Traditional RAG agents retrieve relevant documents and provide answers based on the retrieved context. In this solution, we enhance the agent by adding a custom tool called AskHumanTool. This tool allows the agent to ask the user for clarification when the initial question is unclear.

By incorporating this interactive dialogue, the agent can gather the necessary context to provide accurate and helpful answers, even with ambiguous queries.
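
One possible implementation of this pattern, continuing the retriever and llm set up in the overview (the tool names, descriptions, and the console input() fallback are illustrative assumptions; in the Streamlit app, input() would be replaced by a chat widget):

    from langchain.agents import AgentType, Tool, initialize_agent

    def search_kendra(query: str) -> str:
        """Return the text of the top passages retrieved from the Kendra index."""
        docs = retriever.get_relevant_documents(query)  # retriever from the setup above
        return "\n\n".join(doc.page_content for doc in docs)

    kendra_tool = Tool(
        name="KendraSearch",
        func=search_kendra,
        description="Searches the enterprise knowledge base for passages relevant to a question.",
    )

    # AskHumanTool lets the agent pause and ask the user a clarifying question.
    ask_human_tool = Tool(
        name="AskHumanTool",
        func=lambda question: input(f"\nAgent asks: {question}\n> "),
        description=(
            "Use this tool when the user's question is ambiguous or the retrieved "
            "context is insufficient. The input is a clarifying question to show the user."
        ),
    )

    # The agent decides at each step whether to search Kendra or ask the human.
    agent = initialize_agent(
        tools=[kendra_tool, ask_human_tool],
        llm=llm,  # Bedrock LLM from the setup above
        agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
        verbose=True,
    )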

Example Workflow

Here is an example workflow:

  1. The user asks a question: “How many GPUs does my EC2 instance have?”
  2. The agent uses the LLM to decide the next action.
  3. The agent retrieves information from the Amazon Kendra index.
  4. If the retrieved context is insufficient, the agent uses the AskHumanTool to ask the user for clarification.
  5. Once the user provides the necessary information, the agent incorporates it and retrieves the correct answer.

This interactive approach improves the reliability and accuracy of responses, leading to a better customer experience in various RAG applications.
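
Using the agent sketched above, a single call is enough to exercise this workflow; depending on what the Kendra index returns, the agent may interleave a KendraSearch step with an AskHumanTool turn before producing the final answer:

    # The agent may first ask, for example, "Which EC2 instance type are you using?"
    # and then search the index again with the clarified question.
    answer = agent.run("How many GPUs does my EC2 instance have?")
    print(answer)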

Clean Up

To avoid unnecessary costs, delete the Amazon Kendra index and shut down the SageMaker Studio instance if not in use.
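
As a minimal sketch, the index can also be deleted programmatically with boto3 (the index ID and Region below are placeholders):

    import boto3

    # Deleting the index stops the Amazon Kendra charges for this demo.
    kendra = boto3.client("kendra", region_name="us-east-1")  # placeholder Region
    kendra.delete_index(Id="YOUR-KENDRA-INDEX-ID")            # placeholder index ID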

Conclusion

By adding interactive user interaction to RAG systems, we can enhance the customer experience and deliver more satisfactory answers. This approach can be applied to various generative AI use cases, not just RAG. To learn more about using Amazon Kendra with generative AI, refer to our resources.

For more information on AI solutions and how they can transform your company, contact us at hello@itinai.com or visit our website.

Spotlight on a Practical AI Solution: AI Sales Bot

Discover how our AI Sales Bot can automate customer engagement and manage interactions across all stages of the customer journey. Visit itinai.com/aisalesbot to redefine your sales processes and customer engagement.

List of Useful Links:

AI Products for Business or Try Custom Development

AI Sales Bot

Welcome AI Sales Bot, your 24/7 teammate! Engaging customers in natural language across all channels and learning from your materials, it’s a step towards efficient, enriched customer interactions and sales.

AI Document Assistant

Unlock insights and drive decisions with our AI Insights Suite. Indexing your documents and data, it provides smart, AI-driven decision support, enhancing your productivity and decision-making.

AI Customer Support

Upgrade your support with our AI Assistant, reducing response times and personalizing interactions by analyzing documents and past engagements. Boost your team and customer satisfaction.

AI Scrum Bot

Enhance agile management with our AI Scrum Bot: it helps organize retrospectives, answers queries, and boosts collaboration and efficiency in your scrum processes.