Mume AI

MCP Servers

Connect local tools to Mume AI using the Model Context Protocol (MCP). Give the model access to your filesystem, databases, APIs, and anything else you can expose as an MCP server.

What is MCP?

MCP is an open protocol that lets AI applications connect to external tools and data sources. An MCP server exposes tools that the model can call during a conversation — reading files, querying databases, searching the web, and more.

Mume AI supports connecting to any MCP server that speaks the Streamable HTTP transport (the /mcp endpoint pattern).
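Under the hood, a Streamable HTTP client speaks JSON-RPC 2.0 over POST requests to the `/mcp` endpoint. As a rough sketch (the exact handshake is defined by the MCP specification; the client name and version below are illustrative), the first message a client sends is an `initialize` request:

```python
import json

# Illustrative MCP "initialize" request, as POSTed to the /mcp endpoint.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # spec revision the client speaks
        "capabilities": {},               # features the client supports
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

body = json.dumps(initialize_request)
print(body)
```

The server replies with its own capabilities, after which the client can discover and call tools.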

Quick Start — Local Filesystem Server

The fastest way to try MCP is to give the model read/write access to a folder on your machine using the official @modelcontextprotocol/server-filesystem package and mcp-proxy.

1. Start the proxy + server

Open a terminal and run a single command. Replace /path/to/folder with the directory you want to expose:

Bash
npx mcp-proxy --port 8081 -- npx -y @modelcontextprotocol/server-filesystem /path/to/folder

For example, to expose your Desktop:

Bash
npx mcp-proxy --port 8081 -- npx -y @modelcontextprotocol/server-filesystem ~/Desktop

This starts a proxy on port 8081 that wraps the filesystem MCP server and exposes it over Streamable HTTP at http://localhost:8081/mcp.

2. Allow local network access

When you first connect, your browser will ask: "mume.ai wants to look for and connect to devices on your local network." Click Allow — this lets the browser reach localhost.

3. Add the server in Mume

  1. Open a chat on mume.ai
  2. Click the + button in the chat input area
  3. Go to the MCP tab
  4. Enter a server name, e.g. fs
  5. Enter the URL: http://localhost:8081/mcp
  6. Click the plug icon to connect

Once connected, the server will show a green dot and list its available tools (e.g. read_file, write_file, list_directory). You can toggle individual tools on or off.
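That tool list comes from the MCP `tools/list` method. A sketch of the exchange, with abbreviated schemas and only two of the filesystem server's tools shown:

```python
# Hypothetical "tools/list" exchange: the client asks which tools the
# server exposes; the server answers with names and JSON Schemas.
request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

# Example response shape (abbreviated; real schemas carry descriptions).
response = {
    "jsonrpc": "2.0",
    "id": 2,
    "result": {
        "tools": [
            {"name": "read_file",
             "inputSchema": {"type": "object",
                             "properties": {"path": {"type": "string"}},
                             "required": ["path"]}},
            {"name": "list_directory",
             "inputSchema": {"type": "object",
                             "properties": {"path": {"type": "string"}},
                             "required": ["path"]}},
        ]
    },
}

tool_names = [t["name"] for t in response["result"]["tools"]]
print(tool_names)
```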

How It Works

  1. mcp-proxy starts the MCP server as a child process using the stdio transport.
  2. It exposes a Streamable HTTP endpoint at /mcp on the port you specify.
  3. Mume AI connects to that endpoint from your browser, discovers the available tools, and makes them available to the model.
  4. When the model decides to use a tool, the call is routed through the proxy to the MCP server running on your machine.
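The last step corresponds to a JSON-RPC `tools/call` message. A minimal sketch (the file path is hypothetical):

```python
import json

# Illustrative "tools/call" message: roughly what flows from the browser
# through mcp-proxy to the filesystem server when the model reads a file.
tool_call = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "read_file",
        "arguments": {"path": "/path/to/folder/notes.txt"},  # hypothetical
    },
}

print(json.dumps(tool_call, indent=2))
```

The server executes the tool locally and returns the result in the JSON-RPC response, which the proxy relays back to the browser.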

More MCP Servers

Any stdio-based MCP server can be exposed via mcp-proxy. Here is a popular one:

Memory (Knowledge Graph)

Bash
npx mcp-proxy --port 8084 -- npx -y @modelcontextprotocol/server-memory

Browse the full list on github.com/modelcontextprotocol/servers and Awesome MCP Servers.

Using a Remote or Already-HTTP MCP Server

If the MCP server already exposes a Streamable HTTP endpoint (no proxy needed), just enter its URL directly in the MCP tab — e.g. https://mcp.example.com/mcp.

Troubleshooting

Browser blocks the connection

Make sure you clicked Allow on the local network access prompt. If you accidentally blocked it, go to your browser settings and reset the permission for mume.ai.

Connection refused

  • Verify the proxy is running: you should see output in the terminal where you ran npx mcp-proxy.
  • Check the port matches what you entered in the MCP tab.
  • Ensure nothing else is using the same port.

CORS errors

mcp-proxy includes CORS headers by default. If you're running your own HTTP server, make sure it returns Access-Control-Allow-Origin: * (or the specific origin).
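If you're writing such a server in Python, here is a minimal sketch of a handler that sets the header (the handler and endpoint are illustrative, not part of mcp-proxy):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class CORSHandler(BaseHTTPRequestHandler):
    def _send_headers(self):
        self.send_response(200)
        # Allow any origin; tighten to the specific origin if preferred.
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Content-Type", "application/json")
        self.end_headers()

    def do_GET(self):
        self._send_headers()
        self.wfile.write(b"{}")

    def do_OPTIONS(self):  # CORS preflight
        self._send_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), CORSHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

resp = urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/mcp")
print(resp.headers["Access-Control-Allow-Origin"])  # prints "*"
server.shutdown()
```

Without that header, the browser will block the cross-origin request from mume.ai even though the server is reachable.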

Tools not appearing

  • Expand the server entry in the MCP tab — tools are listed inside.
  • Check the terminal for errors from the MCP server process.
  • Try disconnecting and reconnecting (click the plug icon).

Security Notes

  • MCP servers run on your machine — only expose directories and resources you're comfortable sharing with the model.
  • Local servers bind to localhost by default and are not accessible from the internet.
  • Tool calls are executed locally; Mume's servers never see your files directly.
