Implementing an LLM Agent with Tool Access Using MCP-Use
MCP-Use is an open-source library that connects any large language model (LLM) to any MCP server, giving your agents access to tools such as web browsing and file operations without relying on proprietary clients. This guide demonstrates how to build a simple chatbot with MCP-Use and LangChain's Groq integration, using the agent's built-in conversation memory for multi-turn interactions.
Step 1: Setting Up the Environment
Installing the UV Package Manager
To begin, we need to install the UV package manager. Here are the commands based on your operating system:
- Mac or Linux: Run the command:
curl -LsSf https://astral.sh/uv/install.sh | sh
- Windows: Use PowerShell to execute:
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
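To confirm the installation succeeded, open a new terminal and check the installed version:
uv --version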
Creating a New Project Directory
Next, create a new project directory and initialize it:
uv init mcp-use-demo
Then navigate to the directory:
cd mcp-use-demo
Activating a Virtual Environment
Activate a virtual environment using the following commands:
- Mac or Linux:
uv venv
followed by:
source .venv/bin/activate
- Windows:
uv venv
followed by:
.venv\Scripts\activate
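Once the environment is active, the python command should point at the interpreter inside .venv. A quick cross-platform check is to print the interpreter prefix, which should end with your project's .venv directory:
python -c "import sys; print(sys.prefix)"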
Installing Python Dependencies
Install the required dependencies with the command:
uv add mcp-use langchain-groq python-dotenv
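uv records these packages in pyproject.toml. After the command finishes, the dependency list in that file should look roughly like the sketch below (uv normally adds version constraints as well, which will vary depending on the versions it resolves):
[project]
name = "mcp-use-demo"
# version, requires-python, and other fields generated by uv init are omitted here
dependencies = [
    "langchain-groq",
    "mcp-use",
    "python-dotenv",
]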
Step 2: Setting Up the Environment Variables
Groq API Key
To access Groq’s LLMs, generate an API key from the Groq Console. Create a .env file in your project directory and add the following line, pasting your key after the equals sign:
GROQ_API_KEY=
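To confirm that python-dotenv can read the key (a throwaway sanity check, not part of the final app), you can run a short snippet from the project root:
from dotenv import load_dotenv
import os

load_dotenv()  # reads the .env file in the current working directory
print("GROQ_API_KEY found:", os.getenv("GROQ_API_KEY") is not None)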
Brave Search API Key
This tutorial uses the Brave Search MCP Server. Obtain your Brave Search API key from the Brave Search API website. Create a file named mcp.json in the project root with the following configuration:
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-brave-search"
      ],
      "env": {
        "BRAVE_API_KEY": ""
      }
    }
  }
}
Replace the empty BRAVE_API_KEY string with your actual Brave Search API key.
Node.js Installation
Some MCP servers, including Brave Search, require Node.js. Download and install the latest version from nodejs.org, keeping all settings as default during installation.
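After installation, confirm that both node and npx (the command the mcp.json configuration above uses to launch the server) are available on your PATH:
node --version
npx --version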
Using Other MCP Servers
If you wish to use a different MCP server, simply modify the contents of mcp.json with the appropriate configuration for that server.
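For example, a configuration for the official filesystem MCP server (@modelcontextprotocol/server-filesystem) might look like the sketch below; the directory path is a placeholder you would replace with a folder you want the agent to access:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/directory"
      ]
    }
  }
}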
Step 3: Implementing the Chatbot
Creating the App File
Create an app.py file in your directory and add the following content:
from dotenv import load_dotenv
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient
import asyncio
import os
import sys
import warnings

warnings.filterwarnings("ignore", category=ResourceWarning)
Setting Up the Chatbot
The following code loads the environment variables, creates the MCP client from mcp.json, and initializes the Groq-backed agent:
async def run_chatbot():
    load_dotenv()
    os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")
    configFile = "mcp.json"
    print("Starting chatbot...")
    client = MCPClient.from_config_file(configFile)
    llm = ChatGroq(model="llama-3.1-8b-instant")
    agent = MCPAgent(llm=llm, client=client, max_steps=15, memory_enabled=True, verbose=False)
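At this point the agent is already usable. To sanity-check it before adding the interactive loop, you can temporarily add a single query inside run_chatbot() (the question here is only an illustrative placeholder; remove these two lines once the loop below is in place):
    result = await agent.run("What is the Model Context Protocol?")
    print(result)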
Implementing Interactive Chatting
Still inside run_chatbot(), add the interactive loop below. It handles user input, the exit and clear commands, and closes the MCP sessions when the conversation ends:
print("-----Interactive MCP Chat-----")
print("Type 'exit' or 'quit' to end the conversation")
print("Type 'clear' to clear conversation history")
try:
while True:
user_input = input("You: ")
if user_input.lower() in ["exit", "quit"]:
print("Ending conversation....")
break
if user_input.lower() == "clear":
agent.clear_conversation_history()
print("Conversation history cleared....")
continue
print("Assistant: ", end="", flush=True)
response = await agent.run(user_input)
print(response)
finally:
if client and client.sessions:
await client.close_all_sessions()
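As written, app.py only defines run_chatbot() and never calls it, so the script would exit without doing anything. A minimal entry point (using the asyncio import at the top of the file) looks like this:
if __name__ == "__main__":
    # Start the asynchronous chat loop and block until the user exits
    asyncio.run(run_chatbot())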
Running the Application
To run the application, use the command:
uv run app.py
This command starts the chatbot, allowing you to chat with it and let the agent call the tools exposed by the Brave Search MCP server.
Conclusion
In summary, integrating MCP-Use with a chatbot gives it seamless access to external tools such as web search, enhancing user interaction and automating tasks. By following the steps above, you can swap in other MCP servers or models with small configuration changes, and businesses can apply the same pattern to improve efficiency and customer engagement. Start small, measure results, and gradually expand your AI initiatives to maximize impact.
If you require assistance in managing AI in your business, please reach out to us at hello@itinai.ru or connect with us on Telegram, X, and LinkedIn.