
LLaMa MCP Streamlit

by Nikunj2003


What is LLaMa-MCP-Streamlit

LLaMa-MCP-Streamlit is an interactive AI assistant developed by Nikunj2003. It combines Streamlit, NVIDIA NIM's API (LLaMa 3.3 70B) or Ollama, and the Model Context Protocol (MCP) to provide a seamless conversational interface. At its core, the app lets a large language model (LLM) execute external tools in real time via MCP, retrieve data, and perform a variety of actions efficiently. It also supports custom model selection, API configuration for different backends, and real-time data processing through tool integration.

How to Use LLaMa-MCP-Streamlit

To start using LLaMa-MCP-Streamlit, you need to set up the environment and choose your preferred way of running the app. Here's a quick guide:

  1. Configure Environment Variables: Set up your .env file with the necessary API keys for NVIDIA and/or Ollama (a sample .env is sketched after this list).

  2. Running with Poetry:

    • Install dependencies using Poetry:
      poetry install
      
    • Launch the Streamlit app:
      poetry run streamlit run llama_mcp_streamlit/main.py
      
  3. Running with Docker:

    • Build the Docker image:
      docker build -t llama-mcp-assistant .
      
    • Run the Docker container:
      docker compose up
      
  4. Adjust MCP Server Configuration: Modify the utils/mcp_server.py file to select either NPX or Docker as your MCP server setup, depending on your needs (a rough sketch of this choice follows below).
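
As a quick illustration of step 1, a .env file for this setup might look like the following. The variable names here are assumptions for illustration only; check the project's README for the exact keys it expects:

      # Hypothetical variable names -- confirm against the project's documentation
      API_ENDPOINT=https://integrate.api.nvidia.com/v1
      API_KEY=nvapi-xxxxxxxx
      OLLAMA_ENDPOINT=http://localhost:11434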
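
As for step 4, the NPX/Docker choice typically comes down to which command launches the MCP server process. The sketch below uses the StdioServerParameters type from the official MCP Python SDK; the actual contents of utils/mcp_server.py may differ, and the server package names are placeholders:

      # Hypothetical sketch -- the real utils/mcp_server.py may be structured differently
      from mcp import StdioServerParameters

      # Option A: launch an MCP server via NPX
      NPX_SERVER = StdioServerParameters(
          command="npx",
          args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
      )

      # Option B: launch the same server inside a Docker container
      DOCKER_SERVER = StdioServerParameters(
          command="docker",
          args=["run", "-i", "--rm", "mcp/filesystem", "/tmp"],
      )

      # Select whichever option fits your environment
      SERVER_PARAMS = NPX_SERVER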

Once set up, you can interact with the AI assistant through Streamlit's user-friendly chat interface, taking advantage of its real-time data retrieval and tool execution capabilities.

Key Features of LLaMa-MCP-Streamlit

  • Conversational Interface: Built with Streamlit, it offers a chat-based user experience that is intuitive and easy to use (a minimal sketch of the pattern follows this list).

  • Custom Model Selection: Choose between NVIDIA NIM and Ollama models to fit your specific use case and requirements.

  • Real-time Tool Execution: Leverage the power of MCP to execute external tools and process data in real-time.

  • API Configuration: Customize the backend integration to suit various application needs.

  • Streamlined Deployment: Supports deployment via Docker for a containerized setup, making it easier to manage and scale.

  • Interactive UI Elements: Enjoy a visually appealing and interactive experience with Streamlit's UI components, designed for streamlined navigation and usage.
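
To make the conversational interface concrete, the snippet below is a minimal, generic Streamlit chat loop using st.chat_input and st.chat_message. It is a sketch of the pattern, not the project's actual main.py, and the placeholder reply stands in for a real NIM/Ollama call:

      import streamlit as st

      st.title("LLaMa MCP Streamlit")

      # Persist the conversation across Streamlit reruns
      if "messages" not in st.session_state:
          st.session_state.messages = []

      # Replay the chat history
      for msg in st.session_state.messages:
          with st.chat_message(msg["role"]):
              st.write(msg["content"])

      # Read user input and append a reply
      if prompt := st.chat_input("Ask the assistant..."):
          st.session_state.messages.append({"role": "user", "content": prompt})
          with st.chat_message("user"):
              st.write(prompt)
          # Placeholder reply; the real app would call NIM/Ollama here
          reply = "(the LLM's answer, generated via NIM or Ollama, would go here)"
          st.session_state.messages.append({"role": "assistant", "content": reply})
          with st.chat_message("assistant"):
              st.write(reply)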

Whether you are looking to enhance productivity by integrating various tools or simply exploring the capabilities of a conversational AI assistant, LLaMa-MCP-Streamlit offers a comprehensive solution tailored to modern AI interaction needs.

How to Use

To use LLaMa-MCP-Streamlit, follow these steps:

  1. Visit https://github.com/Nikunj...
  2. Follow the setup instructions to create an account (if required)
  3. Connect the MCP server to your Claude Desktop application (a sample configuration entry is shown after these steps)
  4. Start using LLaMa-MCP-Streamlit capabilities within your Claude conversations
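
For step 3, Claude Desktop discovers MCP servers through its claude_desktop_config.json file. The entry below shows the standard mcpServers format; the server name, command, and arguments are placeholders, so consult the project's README for the exact entry to use:

      {
        "mcpServers": {
          "example-server": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
          }
        }
      }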

Additional Information

Created: February 7, 2025

Company: Nikunj2003

Start building your own MCP Server

Interested in creating your own MCP Server? Check out the official documentation and resources.
