# Development
This guide is for contributors who want to work on the mcp-ollama-python codebase itself. If you just want to use the server, see Installation.
## Dev Setup

```bash
# Clone and install with all dependency groups
git clone https://github.com/pblagoje/mcp-ollama-python.git
cd mcp-ollama-python
py -m poetry install

# Run the server locally
py -m poetry run mcp-ollama-python
```
## Testing
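A typical invocation, assuming the suite is run with pytest under Poetry (the actual runner, test paths, and flags may differ — check `pyproject.toml` and the `tests/` directory):

```bash
# Run the full test suite (assumes pytest; adjust if the project uses another runner)
py -m poetry run pytest

# Run a single test module with verbose output (path is illustrative)
py -m poetry run pytest tests/test_server.py -v

# Coverage report (requires the pytest-cov plugin to be installed)
py -m poetry run pytest --cov=src --cov-report=term-missing
```

These commands assume the cloned repository and the dependency groups installed in Dev Setup above.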
## Code Quality

```bash
# Format
py -m poetry run black src/

# Lint
py -m poetry run flake8 src/

# Pre-commit hooks (install once, runs on every commit)
py -m poetry run pre-commit install
py -m poetry run pre-commit run --all-files
```
## Building the Windows Executable
The spec file reads the version from pyproject.toml and produces an EXE named like mcp-ollama-python-1.0.3-win11-x64.exe.
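A spec-file build of this kind is typically driven by PyInstaller. A plausible invocation follows — the spec filename `mcp-ollama-python.spec` and the `--with build` dependency group are assumptions; check the repository root and `pyproject.toml` for the actual names:

```bash
# Install build-time dependencies (group name is an assumption)
py -m poetry install --with build

# Build the executable from the spec file (spec filename is an assumption)
py -m poetry run pyinstaller mcp-ollama-python.spec

# PyInstaller writes the resulting EXE to dist\ by default
dir dist
```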
## Building the Docs

```bash
# Install docs dependencies
py -m poetry install --with docs

# Live preview
py -m poetry run mkdocs serve

# Build static site
py -m poetry run mkdocs build --strict
```
Docs are deployed automatically to GitHub Pages on every push to `main` via the `docs.yml` workflow.
## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Write tests for your changes
- Commit with clear messages (`git commit -m 'Add amazing feature'`)
- Push to your branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## Code Quality Standards

- All new tools must export `tool_definition`
- Maintain comprehensive test coverage
- Follow existing Python patterns (Black formatting, Pydantic schemas)
- See Architecture for how to add new tools
## Related Projects
- ollama-mcp (TypeScript) — Original TypeScript implementation
- Ollama — Get up and running with large language models locally
- Model Context Protocol — Open standard for AI assistant integration
- Windsurf — AI-powered code editor with MCP support
- Cline — VS Code AI assistant