A minimal client app to interact with local Ollama models, leveraging multiple MCP agent tools via the BeeAI framework.
Below is a sample view of this client app's chat interface, showing a Postgres database operation along with the thinking steps the AI took to pick the right MCP agent and transform the request and response with the LLM:
Install and serve Ollama on your local machine with the following commands:
$ curl -fsSL https://ollama.com/install.sh | sh
$ ollama serve
$ ollama pull llama3.1
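To verify the server is up and the model was pulled, you can query Ollama's standard tags endpoint, which lists the locally available models:

```shell
$ curl http://localhost:11434/api/tags
```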
Add your MCP agents to the mcp-servers.json file in the root folder so the app can pick them up and use them alongside the LLM.
Make sure to update your Postgres connection URL.
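For reference, a minimal mcp-servers.json might look like the sketch below, assuming the common mcpServers layout used by MCP clients. The server name, package, and connection string are placeholders — substitute your own:

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@localhost:5432/mydb"
      ]
    }
  }
}
```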
If you want to use a different LLM model or LLM server, override the properties below before npm start:
OLLAMA_CHAT_MODEL=llama3.1
OLLAMA_BASE_URL=http://localhost:11434/api
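For example, the variables can be set inline for a single run (mistral here is only an illustration — use any model you have pulled into Ollama):

```shell
$ OLLAMA_CHAT_MODEL=mistral OLLAMA_BASE_URL=http://localhost:11434/api npm start
```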
$ git clone https://github.com/tamdilip/mcp-ollama-beeai.git
$ cd mcp-ollama-beeai
$ npm i
$ npm start
Once the app is up and running, open http://localhost:3000 in your browser.
Your configured MCP servers and their tools are listed in the Server & tools dropdown in the UI.
The BeeAI framework is used for easy setup of a ReAct (Reason and Act) agent with MCP tools.
The Markdown JS library is used to render the responses in a readable visual format.
Happy coding :) !!