This repository contains a Model Control Plane (MCP) server implementation that supports OpenAI services, Git repository analysis, local filesystem operations, and Prometheus integration.
```
MCP/
├── mcp/                # Core MCP library modules
├── scripts/            # Utility scripts and test tools
├── prometheus/         # Prometheus configuration
├── docker-compose.yml  # Docker configuration
├── mcp_server.py       # Main server implementation
├── mcp_run             # Main runner script (shortcut)
└── README.md           # This file
```
- Python 3.8+
- FastAPI
- Uvicorn
- OpenAI SDK
- GitPython
- Requests
- Docker and Docker Compose (for Prometheus features)
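The dependency list above would correspond to a `requirements.txt` along these lines (package names are inferred from the list; this file is illustrative, not copied from the repository):

```text
fastapi
uvicorn
openai
GitPython
requests
```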
- Clone this repository
- Install the dependencies:

```bash
pip install -r requirements.txt
```

Set the following environment variables:

For Azure OpenAI:

```bash
export AZURE_OPENAI_ENDPOINT="your-azure-endpoint"
export AZURE_OPENAI_API_KEY="your-azure-api-key"
export AZURE_OPENAI_API_VERSION="2023-05-15"
export AZURE_DEPLOYMENT_NAME="your-deployment-name"
```

For Standard OpenAI:

```bash
export OPENAI_API_KEY="your-openai-api-key"
# Optional: specify which models to use
export OPENAI_CHAT_MODEL="gpt-4o-mini"                   # Default if not specified
export OPENAI_COMPLETION_MODEL="gpt-3.5-turbo-instruct"  # Default if not specified
```

For Prometheus:

```bash
export PROMETHEUS_URL="http://localhost:9090"  # Default if not specified
```

Start the MCP server:

```bash
python scripts/start_mcp_server.py
```

Or, for more options:

```bash
python scripts/start_mcp_server.py --host 0.0.0.0 --port 8000 --debug
```

The server will be available at http://localhost:8000.
We provide a unified testing script that offers a user-friendly interface to all testing functionality:

```bash
./mcp_run
```

This interactive script provides:
- Filesystem tests
- Git integration tests
- Memory analysis tools
- Prometheus tests & memory stress
- MCP server management
- Environment setup
You can also run individual tests directly:
Test the OpenAI integration:

```bash
python scripts/test_mcp_client.py
```

Test the Git integration (provide a Git repository URL):

```bash
python scripts/test_git_integration.py https://github.com/username/repository
```

Test the Git diff functionality (analyze requirements compatibility):

```bash
python scripts/test_git_diff.py https://github.com/username/repository [commit-sha]
```

Test the filesystem functionality:

```bash
python scripts/test_filesystem.py
```

Test the langflow integration with MCP:

```bash
python scripts/test_langflow_integration.py [OPTIONAL_REPO_URL]
```

Test the Prometheus integration:

```bash
python scripts/test_prometheus.py [prometheus_url]
```

For more advanced Git repository analysis with AI recommendations:

```bash
python scripts/langflow_git_analyzer.py https://github.com/username/repository
```

You can also search for specific patterns in the repository:

```bash
python scripts/langflow_git_analyzer.py https://github.com/username/repository --search "def main"
```

Or analyze the last commit diff with AI insights:

```bash
python scripts/langflow_git_analyzer.py https://github.com/username/repository --diff
```

MCP includes several tools for memory monitoring and analysis:
```bash
# Basic memory diagnostics with AI analysis
python scripts/ai_memory_diagnostics.py

# Interactive memory dashboard
python scripts/mcp_memory_dashboard.py

# Memory alerting system
python scripts/mcp_memory_alerting.py
```

You can also simulate memory pressure for testing:

```bash
python scripts/simulate_memory_pressure.py --target 85 --duration 300
```

- Start the Prometheus stack using Docker Compose:

```bash
docker compose up -d
```

This will start:
- Prometheus server (accessible at http://localhost:9090)
- Node Exporter (for host metrics)
- cAdvisor (for container metrics)
- For stress testing, you can start the memory stress container:

```bash
docker compose up -d --build memory-stress
```

Or use the container test script:

```bash
./scripts/container-memory-test.sh start
```

This project includes multiple Docker configurations and reset scripts for reliable operation across different environments:
- **Standard Configuration** (`docker-compose.yml`): Uses custom Dockerfiles for Prometheus and Langflow to ensure consistent permissions across systems.
- **Bridge Network Configuration** (`docker-compose.bridge.yml`): Alternative configuration that uses bridge networking for environments where host networking is problematic.
The project uses custom Dockerfiles for both Prometheus and Langflow to solve common permission issues:
- `Dockerfile.prometheus`: Sets up the Prometheus configuration with proper permissions for the `nobody` user.
- `Dockerfile.langflow`: Copies the components directory into the container without changing file ownership, allowing Langflow to access the components without permission errors.
This approach eliminates the need for volume mounts that can lead to permission conflicts across different machines and user configurations.
- **All Services Reset** (`reset-all.sh`): Reset all containers with a single command.

```bash
# Basic reset (rebuilds containers with existing volumes)
./reset-all.sh

# Full reset (removes volumes and rebuilds containers)
./reset-all.sh --clean
```

- **Individual Service Reset**:

```bash
# Reset only Prometheus
./reset-prometheus.sh

# Reset only Langflow
./reset-langflow.sh
```
These scripts ensure that the containers are properly configured with correct permissions and the latest code changes.
If you encounter permission issues:
- Use the reset scripts to rebuild the containers
- Check the logs with `docker compose logs <service_name>`
- Make sure any components added to Langflow are included in `Dockerfile.langflow`
When deploying to a new machine:
- Clone the repository
- Make the reset scripts executable: `chmod +x *.sh`
- Run the reset script: `./reset-all.sh`
The custom Dockerfiles are designed to avoid the permission issues that commonly occur across different systems.
The MCPAIComponent class includes Prometheus capabilities:
```python
from langflow import MCPAIComponent

# Initialize the client
mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")

# Instant query (current metric values)
result = mcp.prometheus_query("up")

# Range query (metrics over time)
result = mcp.prometheus_query_range(
    query="rate(node_cpu_seconds_total{mode='system'}[1m])",
    start="2023-03-01T00:00:00Z",
    end="2023-03-01T01:00:00Z",
    step="15s",
)

# Get all labels
labels = mcp.prometheus_get_labels()

# Get label values
values = mcp.prometheus_get_label_values("job")

# Get targets
targets = mcp.prometheus_get_targets()

# Get alerts
alerts = mcp.prometheus_get_alerts()
```

Useful example queries:

- CPU Usage: `rate(node_cpu_seconds_total{mode!="idle"}[1m])`
- Memory Usage: `node_memory_MemTotal_bytes - node_memory_MemAvailable_bytes`
- Disk Usage: `node_filesystem_avail_bytes{mountpoint="/"} / node_filesystem_size_bytes{mountpoint="/"}`
- Container CPU Usage: `rate(container_cpu_usage_seconds_total[1m])`
- Container Memory Usage: `container_memory_usage_bytes`
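As a sanity check on the memory query above, the same "used = total - available" arithmetic can be reproduced in plain Python. The byte values below are made-up sample scrape results, not real metrics:

```python
# Sample values as node_exporter would report them, in bytes (illustrative only)
mem_total_bytes = 16 * 1024**3      # node_memory_MemTotal_bytes (16 GiB)
mem_available_bytes = 4 * 1024**3   # node_memory_MemAvailable_bytes (4 GiB)

# Used memory, matching the PromQL expression
# node_memory_MemTotal_bytes - node_memory_MemAvailable_bytes
mem_used_bytes = mem_total_bytes - mem_available_bytes

# Usage as a percentage of total
mem_used_percent = 100 * mem_used_bytes / mem_total_bytes
print(f"{mem_used_bytes} bytes used ({mem_used_percent:.1f}%)")  # 12884901888 bytes used (75.0%)
```

Note that `MemAvailable` (memory reclaimable for new workloads) is the right basis here, not `MemFree`, which ignores reclaimable page cache.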
- `GET /v1/models` - List all available models
- `GET /v1/models/{model_id}` - Get information about a specific model
- `POST /v1/models/azure-gpt-4/completion` - Generate text completion using Azure OpenAI
- `POST /v1/models/azure-gpt-4/chat` - Generate chat response using Azure OpenAI
- `POST /v1/models/openai-gpt-chat/chat` - Generate chat response using the OpenAI chat model
- `POST /v1/models/openai-gpt-completion/completion` - Generate text completion using the OpenAI completion model
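The exact JSON schema of the chat endpoint isn't spelled out in this README; based on the `mcp.chat(...)` parameters shown in the LangFlow example below, a request body would plausibly look like the following sketch (the field names are an assumption, mirrored from those arguments):

```python
import json

# Hypothetical request body for POST /v1/models/openai-gpt-chat/chat;
# field names mirror the MCPAIComponent.chat() arguments and are an assumption.
payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke about programming."},
    ],
    "max_tokens": 100,
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body)
```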
- `POST /v1/models/git-analyzer/analyze` - Analyze a Git repository
- `POST /v1/models/git-analyzer/search` - Search a Git repository for files matching a pattern
- `POST /v1/models/git-analyzer/diff` - Get the diff of the last commit in a repository
- `POST /v1/models/filesystem/list` - List the contents of a directory
- `POST /v1/models/filesystem/read` - Read a file's contents
- `POST /v1/models/filesystem/read-multiple` - Read multiple files at once
- `POST /v1/models/filesystem/write` - Write content to a file
- `POST /v1/models/filesystem/edit` - Edit a file with multiple replacements
- `POST /v1/models/filesystem/mkdir` - Create a directory
- `POST /v1/models/filesystem/move` - Move a file or directory
- `POST /v1/models/filesystem/search` - Search for files matching a pattern
- `POST /v1/models/filesystem/info` - Get information about a file or directory
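The README doesn't document the request schema for the `edit` endpoint, but "edit a file with multiple replacements" suggests semantics like the following local sketch (the `(old, new)` pair format is an assumption for illustration, not the server's actual wire format):

```python
# Hypothetical illustration of "edit a file with multiple replacements":
# apply a list of (old, new) text replacements in order.
def apply_edits(text, replacements):
    for old, new in replacements:
        text = text.replace(old, new)
    return text

original = "host = localhost\nport = 9090\n"
edited = apply_edits(original, [("localhost", "0.0.0.0"), ("9090", "8000")])
print(edited)
```

Applying replacements in order matters: an earlier replacement can affect what a later one matches.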
- `POST /v1/models/prometheus/query` - Execute an instant query
- `POST /v1/models/prometheus/query_range` - Execute a range query
- `POST /v1/models/prometheus/series` - Get series data
- `GET /v1/models/prometheus/labels` - Get all available labels
- `POST /v1/models/prometheus/label_values` - Get values for a specific label
- `GET /v1/models/prometheus/targets` - Get all targets
- `GET /v1/models/prometheus/rules` - Get all rules
- `GET /v1/models/prometheus/alerts` - Get all alerts
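If the `query` endpoint proxies Prometheus's standard HTTP API, instant-query responses follow a well-known shape; extracting values from such a response looks like this (the sample payload below is hand-written in that standard format, not captured from a live server):

```python
# A hand-written sample in the standard Prometheus instant-query response format
response = {
    "status": "success",
    "data": {
        "resultType": "vector",
        "result": [
            {"metric": {"job": "prometheus", "instance": "localhost:9090"},
             "value": [1677628800, "1"]},
            {"metric": {"job": "node", "instance": "localhost:9100"},
             "value": [1677628800, "0"]},
        ],
    },
}

# Each vector sample carries its labels and a [timestamp, value-as-string] pair;
# note that Prometheus serializes sample values as strings.
statuses = {}
for sample in response["data"]["result"]:
    job = sample["metric"]["job"]
    up = float(sample["value"][1])
    statuses[job] = "up" if up == 1 else "down"
    print(f"{job}: {statuses[job]}")
```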
You can use the MCPAIComponent in your LangFlow pipelines by providing the MCP server URL:
```python
from langflow import MCPAIComponent

mcp = MCPAIComponent(mcp_server_url="http://localhost:8000")

# List available models
models = mcp.list_models()
print(models)

# Generate a chat completion with the OpenAI chat model
chat_response = mcp.chat(
    model_id="openai-gpt-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Tell me a joke about programming."},
    ],
    max_tokens=100,
    temperature=0.7,
)
print(chat_response)

# Generate a text completion with the OpenAI completion model
completion_response = mcp.completion(
    model_id="openai-gpt-completion",
    prompt="Write a function in Python to calculate the factorial of a number:",
    max_tokens=150,
    temperature=0.7,
)
print(completion_response)

# Analyze a Git repository
repo_analysis = mcp.analyze_git_repo("https://github.com/username/repository")
print(repo_analysis)

# Search a Git repository
search_results = mcp.search_git_repo("https://github.com/username/repository", "def main")
print(search_results)

# Get the diff of the last commit
diff_info = mcp.get_git_diff("https://github.com/username/repository")
print(diff_info)

# List files in the current directory
dir_contents = mcp.list_directory()
print(dir_contents)

# Read a file
file_content = mcp.read_file("path/to/file.txt")
print(file_content)

# Write to a file
write_result = mcp.write_file("path/to/new_file.txt", "Hello, world!")
print(write_result)

# Search for files
search_result = mcp.search_files("*.py")
print(search_result)
```

For more structured Git analysis, you can use the GitCodeAnalyzer class:
```python
from langflow_git_analyzer import GitCodeAnalyzer

# Initialize the analyzer
analyzer = GitCodeAnalyzer(mcp_server_url="http://localhost:8000")

# Analyze a repository
analyzer.analyze_repository("https://github.com/username/repository")

# Get a summary
summary = analyzer.get_repository_summary()
print(summary)

# Get AI recommendations
recommendations = analyzer.get_repository_recommendations()
print(recommendations)

# Analyze code patterns
pattern_analysis = analyzer.analyze_code_pattern("def process")
print(pattern_analysis)

# Get the last commit diff
diff_info = analyzer.get_last_commit_diff()
print(diff_info)

# Get a formatted summary of the diff
diff_summary = analyzer.get_formatted_diff_summary()
print(diff_summary)

# Get AI analysis of the commit changes
diff_analysis = analyzer.analyze_commit_diff()
print(diff_analysis)
```

- Verify Prometheus is running: `docker ps | grep prometheus`
- Check that you can access the Prometheus UI: http://localhost:9090
- Verify the MCP server is running and accessible
- Check the MCP server logs for errors
- Try simple queries first to verify connectivity (e.g., the `up` query)
- Verify your API keys are set correctly
- Check for rate limiting or quota issues
- Verify you're using supported models for your API key
- Ensure the Git repository URL is accessible
- Check for authentication issues if using private repositories
- Ensure GitPython is installed correctly