🤖 mcp-ollama-beeai

A minimal client app to interact with local Ollama models, leveraging multiple MCP agent tools via the BeeAI framework.

Below is a sample visual of the client app's chat interface, showing a Postgres database operation along with the thinking steps the AI took to select the right MCP agent and to transform the request and response with the LLM: demo-pic

Usage

📋 Prerequisites

1. Local Ollama server

Install and serve Ollama on your local machine with the following commands.

        $ curl -fsSL https://ollama.com/install.sh | sh
        $ ollama serve
        $ ollama pull llama3.1
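
Optionally, you can confirm the server is reachable and llama3.1 has been pulled by querying Ollama's REST API, which lists the locally available models:

        $ curl http://localhost:11434/api/tags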

2. MCP servers configuration

Add your MCP agents to the mcp-servers.json file in the root folder so the app can pick them up and use them alongside the LLM; a minimal sketch is shown below.
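
The exact schema this app expects isn't documented here; the sketch below assumes the common mcpServers layout used by many MCP clients, with the official @modelcontextprotocol/server-postgres package and a placeholder connection string. Adjust the names and values to your setup.

        {
            "mcpServers": {
                "postgres": {
                    "command": "npx",
                    "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://user:pass@localhost:5432/mydb"]
                }
            }
        }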

3. .env

If you want to use a different LLM model or LLM server, override the properties below before running npm start:

        OLLAMA_CHAT_MODEL=llama3.1
        OLLAMA_BASE_URL=http://localhost:11434/api
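
Since these are plain environment variables, one way to override them for a single run (a sketch assuming a POSIX shell, with mistral as an example model name you have already pulled) is to set them inline:

        $ OLLAMA_CHAT_MODEL=mistral npm start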

🎮 Boot up your app

        $ git clone https://github.com/tamdilip/mcp-ollama-beeai.git
        $ cd mcp-ollama-beeai
        $ npm i
        $ npm start

Once the app is up and running, open http://localhost:3000 in your browser.
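
To quickly check that the app is responding without opening a browser, a simple probe (assuming curl is installed) is:

        $ curl -I http://localhost:3000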

Happy coding :) !!