A feature-complete Model Context Protocol (MCP) server template in Python using FastMCP. This starter demonstrates all major MCP features with clean, Pythonic code.
| Category | Feature | Description |
|---|---|---|
| Tools | `hello` | Basic tool with annotations |
| | `get_weather` | Tool returning structured data |
| | `ask_llm` | Tool that invokes LLM sampling |
| | `long_task` | Tool with 5-second progress updates |
| | `load_bonus_tool` | Dynamically loads a new tool |
| Resources | `info://about` | Static informational resource |
| | `file://example.md` | File-based markdown resource |
| Templates | `greeting://{name}` | Personalized greeting |
| | `data://items/{id}` | Data lookup by ID |
| Prompts | `greet` | Greeting in various styles |
| | `code_review` | Code review with focus areas |
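As a rough sketch of how a prompt like `greet` assembles its text: the function below is a plain-Python stand-in (in the real server it would carry FastMCP's `@mcp.prompt` decorator, and the exact wording, styles, and signature here are assumptions):

```python
def greet(name: str, style: str = "friendly") -> str:
    # Hypothetical prompt builder: returns the prompt text handed to the LLM.
    templates = {
        "friendly": "Please write a warm, friendly greeting for {name}.",
        "formal": "Please write a formal, professional greeting for {name}.",
        "pirate": "Please write a greeting for {name} in the voice of a pirate.",
    }
    # Unknown styles fall back to the friendly template.
    return templates.get(style, templates["friendly"]).format(name=name)

print(greet("Ada", style="formal"))
```

The point of a prompt (versus a tool) is that the server returns text for the client's LLM to act on, rather than executing anything itself.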
- Python 3.11+
- uv (recommended) or pip
```bash
# Clone the repository
git clone https://github.com/SamMorrowDrums/mcp-python-starter.git
cd mcp-python-starter

# Install with uv (recommended)
uv sync

# Or with pip
pip install -e .
```

stdio transport (for local development):

```bash
uv run mcp-python-starter --stdio
```

HTTP transport (for remote/web deployment):

```bash
uv run mcp-python-starter --http --port 3000
```

This project includes VS Code configuration for seamless development:
- Open the project in VS Code
- The MCP configuration is in `.vscode/mcp.json`
- Test the server using VS Code's MCP tools

To use the Dev Container:

- Install the Dev Containers extension
- Open the Command Palette and run "Dev Containers: Reopen in Container"
- Everything is pre-configured and ready to use!
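For orientation, a minimal `.vscode/mcp.json` for this server might look roughly like the following. This is a sketch, not the file shipped in the repo: the field names follow VS Code's MCP server configuration format, and the server name and args are assumptions.

```json
{
  "servers": {
    "mcp-python-starter": {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "mcp-python-starter", "--stdio"]
    }
  }
}
```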
```
.
├── mcp_starter/
│   ├── __init__.py
│   ├── tools.py          # Tool definitions (hello, get_weather, ask_llm, etc.)
│   ├── resources.py      # Resource and template definitions
│   ├── prompts.py        # Prompt definitions
│   └── server.py         # Server orchestration (imports and wires modules)
├── .vscode/
│   ├── mcp.json          # MCP server configuration
│   ├── settings.json     # Python settings
│   └── extensions.json
├── .devcontainer/
│   └── devcontainer.json
├── pyproject.toml        # Project configuration (uv/pip, Ruff config)
└── .python-version
```
```bash
# Run the server (Python reloads automatically on changes)
uv run mcp-python-starter --stdio

# Use MCP Inspector for debugging
uv run mcp dev mcp_starter/server.py

# Format code
uv run ruff format .

# Lint
uv run ruff check .

# Lint with auto-fix
uv run ruff check --fix .

# Type check
uv run pyright
```

Python scripts reload automatically when run with `uv run`. For enhanced debugging, use `mcp dev`, which provides the MCP Inspector UI.
The MCP Inspector is an essential development tool for testing and debugging MCP servers.

```bash
npx @modelcontextprotocol/inspector -- uv run mcp-python-starter
```

- Tools Tab: List and invoke all registered tools with parameters
- Resources Tab: Browse and read resources and templates
- Prompts Tab: View and test prompt templates
- Logs Tab: See JSON-RPC messages between client and server
- Schema Validation: Verify tool input/output schemas

Tips:

- Start the Inspector before connecting your IDE/client
- Use the "Logs" tab to see exact request/response payloads
- Test that tool annotations (`ToolAnnotations`) are exposed correctly
- Verify progress notifications appear for `long_task`
- Check that `Context` injection works for sampling tools
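For reference when reading the Logs tab: a tool invocation travels as a JSON-RPC 2.0 request with method `tools/call`, per the MCP specification. Building one in Python (the `id` and arguments here are illustrative):

```python
import json

# Shape of an MCP "tools/call" request as it appears in the Inspector's
# Logs tab. The id is chosen by the client; arguments match the tool schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "hello",
        "arguments": {"name": "Ada"},
    },
}

wire = json.dumps(request)
print(wire)
```

Comparing payloads like this against what your client actually sends is a quick way to spot schema mismatches.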
A basic tool with annotations:

```python
@mcp.tool(
    title="Say Hello",
    description="A friendly greeting tool",
    annotations={"readOnlyHint": True},
)
def hello(name: str) -> str:
    """Say hello to someone.

    Args:
        name: The name to greet
    """
    return f"Hello, {name}!"
```

A resource template:

```python
@mcp.resource("greeting://{name}")
def greeting_template(name: str) -> str:
    """Generate a personalized greeting."""
    return f"Hello, {name}!"
```

A tool that reports progress:

```python
@mcp.tool(title="Long Task")
async def long_task(
    task_name: str,
    ctx: Context[ServerSession, None],
) -> str:
    for i in range(5):
        await ctx.report_progress(
            progress=i / 5,
            total=1.0,
            message=f"Step {i + 1}/5",
        )
        await asyncio.sleep(1.0)
    return "Done!"
```

A tool that invokes LLM sampling via the client:

```python
@mcp.tool(title="Ask LLM")
async def ask_llm(
    prompt: str,
    ctx: Context[ServerSession, None],
) -> str:
    result = await ctx.session.create_message(
        messages=[{"role": "user", "content": {"type": "text", "text": prompt}}],
        max_tokens=100,
    )
    return result.content.text
```

Copy `.env.example` to `.env` and configure:

```bash
cp .env.example .env
```

Contributions welcome! Please ensure your changes maintain feature parity with other language starters.
MIT License - see LICENSE for details.