# Installation

## Prerequisites

- Python 3.10+
- Ollama running locally ([Download Ollama](https://ollama.com/download))
## Install from PyPI
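The PyPI package name is presumably the hyphenated form of the module name, `mcp-ollama-python` (an assumption; check the project's PyPI page if the install fails):

```shell
# Package name assumed from the module name mcp_ollama_python
pip install mcp-ollama-python
```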
That's it. Your MCP client (Windsurf, VS Code, etc.) will start the server automatically — you don't need to run it manually.
## Configure Your IDE

### Windsurf

Add the following to `%USERPROFILE%\.codeium\windsurf\mcp_config.json` (Windows) or `~/.codeium/windsurf/mcp_config.json` (macOS/Linux):
```json
{
  "mcpServers": {
    "ollama": {
      "command": "py",
      "args": ["-m", "mcp_ollama_python"],
      "disabled": false
    }
  }
}
```
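Note that `py` is the Windows Python launcher and is usually not present on macOS/Linux; on those platforms the `command` field would typically be `python3` instead (a sketch of the same config, not confirmed by this page):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "python3",
      "args": ["-m", "mcp_ollama_python"],
      "disabled": false
    }
  }
}
```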
Restart Windsurf — the Ollama MCP server will appear in the MCP panel.
See the full Windsurf Integration guide for advanced setup.
### VS Code

Add the same server entry to your MCP settings:
```json
{
  "mcpServers": {
    "ollama": {
      "command": "py",
      "args": ["-m", "mcp_ollama_python"],
      "disabled": false
    }
  }
}
```
See the full VS Code Integration guide for details.
## Windows Executable

If you prefer a standalone `.exe` (no Python required), download it from the Releases page.
## Verify Installation

```shell
# Check that Ollama is running
curl http://localhost:11434/api/tags

# Verify the module is installed
py -m mcp_ollama_python --help
```
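The `/api/tags` endpoint returns a JSON body listing the installed models (assumed here to follow the Ollama REST API shape `{"models": [{"name": ...}, ...]}`). A few lines of Python can pull out the model names from that response:

```python
import json

def list_model_names(tags_json: str) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    data = json.loads(tags_json)
    return [m["name"] for m in data.get("models", [])]

# Example payload in the assumed response shape
sample = '{"models": [{"name": "llama3:latest"}, {"name": "mistral:latest"}]}'
print(list_model_names(sample))  # → ['llama3:latest', 'mistral:latest']
```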
## Next Steps
- Configuration — Environment variables, custom hosts, model config
- Available Tools — All 8 MCP tools with examples