
What is mcp-client-langchain-py
mcp-client-langchain-py is a user-friendly client designed to demonstrate the integration of Model Context Protocol (MCP) server tools with the LangChain ReAct Agent. Developed by Hideya, this client facilitates seamless interaction between AI models and various data sources through MCP, a protocol that allows AI systems to securely connect with external tools and information. The client leverages the utility function convert_mcp_to_langchain_tools() to convert MCP server tools into LangChain-compatible tools, and supports AI models from Anthropic, OpenAI, and Groq.
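As a sketch of how these pieces fit together, the core flow might look roughly like the following. This is an illustrative approximation, not the client's actual code: the exact signatures of convert_mcp_to_langchain_tools(), the agent construction, and the model name are assumptions, and running it requires the relevant LangChain/LangGraph packages plus an API key.

```python
# Illustrative sketch only -- the real client's code and signatures may differ.
import asyncio

from langchain_anthropic import ChatAnthropic  # assumes langchain-anthropic is installed
from langchain_mcp_tools import convert_mcp_to_langchain_tools
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # MCP servers to launch, in the same shape as llm_mcp_config.json5 entries
    mcp_servers = {
        "fetch": {"command": "uvx", "args": ["mcp-server-fetch"]},
    }

    # Convert the MCP servers' tools into LangChain-compatible tools
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)

    # Build a ReAct agent over a supported chat model (Anthropic here)
    agent = create_react_agent(ChatAnthropic(model="claude-3-5-haiku-latest"), tools)

    result = await agent.ainvoke(
        {"messages": [("user", "Fetch https://example.com and summarize it")]}
    )
    print(result["messages"][-1].content)

    await cleanup()  # shut down the spawned MCP server processes


asyncio.run(main())
```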
How to Use mcp-client-langchain-py
To get started with mcp-client-langchain-py, ensure you have Python 3.11 or higher and, optionally, uv or npm if you plan to run MCP servers distributed as Python or Node.js packages, respectively. You'll also need an API key from Anthropic, OpenAI, or Groq.
Setup Steps:
1. Install Dependencies: Open your terminal and run:
   make install
2. Configure API Keys: Copy the template file and update it with your API keys:
   cp .env.template .env
   Ensure your .env file contains all necessary credentials; the file is protected from accidental commits by .gitignore.
3. Adjust Configuration: Modify the llm_mcp_config.json5 file to set up your LLM and MCP server settings. The JSON5 format allows comments and trailing commas, and supports environment variable interpolation with ${...} notation.
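For illustration, a minimal llm_mcp_config.json5 might look like the following. The exact keys and server entries here are assumptions based on the description above; consult the template shipped with the repository for the authoritative schema:

```json5
{
  // Comments and trailing commas are allowed in JSON5
  "llm": {
    "provider": "anthropic",            // or "openai", "groq"
    "model": "claude-3-5-haiku-latest",
  },
  "mcp_servers": {
    // An MCP server run from a Python package via uv
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"],
    },
    // An MCP server run from a Node.js package via npx,
    // pulling a secret from the environment with ${...} interpolation
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" },
    },
  },
}
```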
Running the Client:
1. Start the application:
   make start
   Note that the initial run may take some time.
2. To run in verbose mode:
   make start-v
3. For help with command-line options:
   make start-h
Once running, you can press Enter at the prompt to use example queries for MCP server tool invocations. These example queries can be customized in the llm_mcp_config.json5 file.
Key Features of mcp-client-langchain-py
- Seamless Integration with MCP: Utilizes the Model Context Protocol to connect AI systems with external data sources securely and efficiently.
- LangChain Compatibility: Converts MCP server tools into LangChain-compatible tools, enabling robust AI interactions.
- Support for Multiple LLMs: Works with language models from Anthropic, OpenAI, and Groq, offering flexibility and choice.
- Customizable Configuration: Offers a flexible configuration setup using JSON5, allowing comments and environment variable references for secure and adaptable setups.
- Ease of Use: Simple setup and operation with command-line interface options for starting, running in verbose mode, and configuring queries.
This client is ideal for users looking to bridge AI models with diverse data sources, leveraging the power of MCP with the flexibility of LangChain tools.
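The ${...} environment-variable interpolation mentioned above can be approximated in a few lines of plain Python. This is a hypothetical re-implementation for illustration; the client's actual interpolation logic may differ:

```python
import os
import re


def interpolate(value: str) -> str:
    """Replace each ${VAR} in value with the VAR environment variable (empty if unset)."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)


os.environ["BRAVE_API_KEY"] = "sk-demo-123"  # stand-in value for the demo
print(interpolate("${BRAVE_API_KEY}"))       # -> sk-demo-123
print(interpolate("no variables here"))      # -> no variables here
```

This keeps secrets out of the config file itself, which is why the earlier setup step stores credentials in .env rather than in llm_mcp_config.json5.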
How to Use
To use mcp-client-langchain-py, follow these steps:
- Visit https://github.com/hideya/mcp-client-langchain-py
- Follow the setup instructions above (or in the repository's README)
- Configure your LLM and MCP servers in llm_mcp_config.json5
- Start the client with make start and begin invoking MCP server tools from your queries
Additional Information
Created
January 12, 2025