Summary
Expose Wiggum's project scanning, interview orchestration, and spec generation as MCP (Model Context Protocol) tools, making them available to any MCP-compatible agent or IDE (Claude Code, Cursor, Windsurf, VS Code, etc.).
Problem / Context
MCP is becoming the standard protocol for extending AI coding agents. By exposing Wiggum's unique capabilities as MCP tools, we make them available to every MCP-compatible environment without requiring users to install or switch to the Wiggum CLI.
Key architectural advantage: Competitors wrap agents (run on top of them). Wiggum can run inside agents via MCP — a fundamentally different integration model that's more natural and less disruptive to existing workflows. Combined with the skill.md distribution strategy, this creates a two-pronged distribution approach: skills for quick adoption, MCP for deep integration.
Roadmap phase: Phase 5 — DISTRIBUTE ("Wiggum meets users where they already are")
Proposed Solution
MCP Tools to Expose
| Tool | Description |
| --- | --- |
| `wiggum_scan` | Run the project scanner, return structured detection results |
| `wiggum_analyze` | Run AI-enhanced codebase analysis |
| `wiggum_interview_start` | Start an interactive interview session for a feature |
| `wiggum_interview_answer` | Submit an answer to the current interview question |
| `wiggum_interview_generate` | Generate a spec from a completed interview |
| `wiggum_list_specs` | List available specs in the project |
| `wiggum_read_spec` | Read a specific spec file |
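The three `wiggum_interview_*` tools imply a multi-turn session held on the server side between tool calls. A minimal sketch of that session store, assuming hypothetical shapes (`InterviewSession`, the placeholder question, and the id scheme are illustrative, not Wiggum's actual API):

```typescript
// Illustrative session store for the multi-turn interview tools.
// In the real server, questions would come from ConversationManager.

type InterviewSession = {
  feature: string;
  questions: string[];
  answers: string[];
};

const sessions = new Map<string, InterviewSession>();

// wiggum_interview_start: create a session, return its id and first question
function interviewStart(feature: string): { sessionId: string; question: string } {
  const id = `session-${sessions.size + 1}`;
  const session: InterviewSession = {
    feature,
    questions: ["What problem does this feature solve?"], // placeholder
    answers: [],
  };
  sessions.set(id, session);
  return { sessionId: id, question: session.questions[0] };
}

// wiggum_interview_answer: record an answer, return the next question or done
function interviewAnswer(
  sessionId: string,
  answer: string
): { done: boolean; question?: string } {
  const session = sessions.get(sessionId);
  if (!session) throw new Error(`Unknown session: ${sessionId}`);
  session.answers.push(answer);
  const next = session.questions[session.answers.length];
  return next ? { done: false, question: next } : { done: true };
}
```

`wiggum_interview_generate` would then look up the session by id and hand the collected answers to the spec generator.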
MCP Resources to Expose
| Resource | Description |
| --- | --- |
| `wiggum://context` | Current `.ralph/.context.json` |
| `wiggum://specs/{name}` | Individual spec files |
| `wiggum://config` | Current `ralph.config.cjs` |
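Resource reads reduce to mapping a `wiggum://` URI onto a project file. A sketch of that routing, assuming the paths in the table above (the `specs/{name}.md` location is a guess, not confirmed by the source):

```typescript
// Map a wiggum:// resource URI to a project-relative file path.
// The spec file location (specs/<name>.md) is an assumption.
function resolveResource(uri: string): string {
  const specMatch = uri.match(/^wiggum:\/\/specs\/(.+)$/);
  if (specMatch) return `specs/${specMatch[1]}.md`;
  if (uri === "wiggum://context") return ".ralph/.context.json";
  if (uri === "wiggum://config") return "ralph.config.cjs";
  throw new Error(`Unknown resource: ${uri}`);
}
```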
Architecture
- Standalone MCP server process (`wiggum mcp-server` or `npx wiggum-mcp`)
- Uses stdio transport (standard for local MCP servers)
- Reuses the existing programmatic API (`Scanner`, `Generator`, `ConversationManager`)
- Stateful session for multi-turn interview flow
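With a stdio transport, an MCP client launches the server as a subprocess and speaks JSON-RPC over its stdin/stdout. A project-level client config for Claude Code might look like the following (a sketch; the exact config file and server name are assumptions):

```json
{
  "mcpServers": {
    "wiggum": {
      "command": "npx",
      "args": ["wiggum-mcp"]
    }
  }
}
```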
Files to Modify
| File | Changes |
| --- | --- |
| `src/mcp/` (new) | MCP server implementation, tool handlers, resource providers |
| `src/index.ts` | Add `mcp-server` command routing |
| `package.json` | Add MCP server entry point, `@modelcontextprotocol/sdk` dependency |
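The `package.json` changes could look roughly like this fragment (the `bin` path and version range are illustrative, not decided by this proposal):

```json
{
  "bin": {
    "wiggum-mcp": "./dist/mcp/server.js"
  },
  "dependencies": {
    "@modelcontextprotocol/sdk": "^1.0.0"
  }
}
```

The `bin` entry is what makes `npx wiggum-mcp` work as a launch command for MCP clients.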
Acceptance Criteria
- `wiggum mcp-server` starts a stdio-based MCP server
- `claude mcp add wiggum` registers the server with Claude Code