MCPS - Model Context Protocol Servers

This repository contains Model Context Protocol (MCP) servers that provide various tools and capabilities for AI models. It was created mainly as a set of small example servers for LocalAI, but the servers work with any MCP client.

Available Servers

πŸ¦† DuckDuckGo Search Server

A web search server that provides search capabilities using DuckDuckGo.

Features:

  • Web search functionality
  • Configurable maximum results (default: 5)
  • JSON schema validation for inputs/outputs

Tool:

  • search - Search the web for information

Configuration:

  • MAX_RESULTS - Environment variable to set maximum number of search results (default: 5)

Docker Image:

docker run -e MAX_RESULTS=10 ghcr.io/mudler/mcps/duckduckgo:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "ddg": {
          "command": "docker",
          "env": {
            "MAX_RESULTS": "10"
          },
          "args": [
            "run", "-i", "--rm", "-e", "MAX_RESULTS",
            "ghcr.io/mudler/mcps/duckduckgo:master"
          ]
        }
      }
    }

🌀️ Weather Server

A weather information server that provides current weather and forecast data for cities worldwide.

Features:

  • Current weather conditions (temperature, wind, description)
  • Multi-day weather forecast
  • URL encoding for city names with special characters
  • JSON schema validation for inputs/outputs
  • HTTP timeout handling

Tool:

  • get_weather - Get current weather and forecast for a city

API Response Format:

{
  "temperature": "29 °C",
  "wind": "20 km/h",
  "description": "Partly cloudy",
  "forecast": [
    {
      "day": "1",
      "temperature": "27 °C",
      "wind": "12 km/h"
    },
    {
      "day": "2",
      "temperature": "22 °C",
      "wind": "8 km/h"
    }
  ]
}

Docker Image:

docker run ghcr.io/mudler/mcps/weather:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "weather": {
          "command": "docker",
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/weather:master"
          ]
        }
      }
    }

🧠 Memory Server

A persistent memory storage server that allows AI models to store, retrieve, and manage information across sessions.

Features:

  • Persistent JSON file storage
  • Add, list, and remove memory entries
  • Unique ID generation for each entry
  • Timestamp tracking for entries
  • Configurable storage location
  • JSON schema validation for inputs/outputs

Tools:

  • add_memory - Add a new entry to memory storage
  • list_memory - List all memory entries
  • remove_memory - Remove a memory entry by ID
  • search_memory - Search memory entries by content (case-insensitive)
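The case-insensitive search can be sketched in Go by lowercasing both the query and each entry's content before a substring match (`Entry` and `searchEntries` are hypothetical names for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

type Entry struct {
	ID      string
	Content string
}

// searchEntries returns every entry whose content contains the query,
// ignoring case — the behaviour described for search_memory.
func searchEntries(entries []Entry, query string) []Entry {
	q := strings.ToLower(query)
	var out []Entry
	for _, e := range entries {
		if strings.Contains(strings.ToLower(e.Content), q) {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	entries := []Entry{
		{ID: "1", Content: "User prefers coffee over tea"},
		{ID: "2", Content: "Meeting at 10am"},
	}
	fmt.Println(len(searchEntries(entries, "COFFEE"))) // prints 1
}
```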

Configuration:

  • MEMORY_FILE_PATH - Environment variable to set the memory file path (default: /data/memory.json)

Memory Entry Format:

{
  "id": "1703123456789000000",
  "content": "User prefers coffee over tea",
  "created_at": "2023-12-21T10:30:56.789Z"
}

Search Response Format:

{
  "query": "coffee",
  "results": [
    {
      "id": "1703123456789000000",
      "content": "User prefers coffee over tea",
      "created_at": "2023-12-21T10:30:56.789Z"
    }
  ],
  "count": 1
}

Docker Image:

docker run -e MEMORY_FILE_PATH=/custom/path/memory.json ghcr.io/mudler/mcps/memory:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "memory": {
          "command": "docker",
          "env": {
            "MEMORY_FILE_PATH": "/data/memory.json"
          },
          "args": [
            "run", "-i", "--rm", "-v", "/host/data:/data",
            "ghcr.io/mudler/mcps/memory:master"
          ]
        }
      }
    }

🏠 Home Assistant Server

A Home Assistant integration server that allows AI models to interact with and control Home Assistant entities and services.

Features:

  • List all entities and their current states
  • Get all available services with detailed information
  • Call services to control devices (turn_on, turn_off, toggle, etc.)

Tools:

  • list_entities - List all entities in Home Assistant
  • get_services - Get all available services in Home Assistant
  • call_service - Call a service in Home Assistant (e.g., turn_on, turn_off, toggle)
  • search_entities - Search for entities by keyword (searches across entity ID, domain, state, and friendly name)
  • search_services - Search for services by keyword (searches across service domain and name)

Configuration:

  • HA_TOKEN - Home Assistant API token (required)
  • HA_HOST - Home Assistant host URL (default: http://localhost:8123)

Entity Response Format:

{
  "entities": [
    {
      "entity_id": "light.living_room",
      "state": "on",
      "friendly_name": "Living Room Light",
      "attributes": {
        "friendly_name": "Living Room Light",
        "brightness": 255
      },
      "domain": "light"
    }
  ],
  "count": 1
}

Service Call Example:

{
  "domain": "light",
  "service": "turn_on",
  "entity_id": "light.living_room"
}

Search Entities Example:

{
  "keyword": "living room light"
}

Search Services Example:

{
  "keyword": "turn_on"
}

Docker Image:

docker run -e HA_TOKEN="your-token-here" -e HA_HOST="http://IP:PORT" ghcr.io/mudler/mcps/homeassistant:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "homeassistant": {
          "command": "docker",
          "env": {
            "HA_TOKEN": "your-home-assistant-token",
            "HA_HOST": "http://"
          },
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/homeassistant:master"
          ]
        }
      }
    }

🐚 Shell Server

A shell script execution server that allows AI models to execute shell scripts and commands.

Features:

  • Execute shell scripts with full shell capabilities
  • Configurable shell command (default: sh -c)
  • Separate stdout and stderr capture
  • Exit code reporting
  • Configurable timeout (default: 30 seconds)
  • JSON schema validation for inputs/outputs

Tool:

  • execute_command - Execute a shell script and return the output, exit code, and any errors

Configuration:

  • SHELL_CMD - Environment variable to set the shell command to use (default: sh). Can include arguments, e.g., bash -x or zsh

Input Format:

{
  "script": "ls -la /tmp",
  "timeout": 30
}

Output Format:

{
  "script": "ls -la /tmp",
  "stdout": "total 1234\ndrwxrwxrwt...",
  "stderr": "",
  "exit_code": 0,
  "success": true,
  "error": ""
}

Docker Image:

docker run -e SHELL_CMD=bash ghcr.io/mudler/mcps/shell:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "shell": {
          "command": "docker",
          "env": {
            "SHELL_CMD": "bash"
          },
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/shell:master"
          ]
        }
      }
    }

πŸ” SSH Server

An SSH server that allows AI models to connect to remote SSH hosts and execute shell scripts.

Features:

  • Connect to remote SSH hosts
  • Execute shell scripts on remote hosts
  • Support for password and key-based authentication
  • Configurable remote shell command (default: sh -c)
  • Separate stdout and stderr capture
  • Exit code reporting
  • Configurable timeout (default: 30 seconds)
  • JSON schema validation for inputs/outputs

Tool:

  • execute_script - Execute a shell script on a remote SSH host and return the output, exit code, and any errors

Configuration:

  • SSH_HOST - Default SSH host (can be overridden per request)
  • SSH_PORT - Default SSH port (default: 22)
  • SSH_USER - Default SSH username (can be overridden per request)
  • SSH_PASSWORD - Default SSH password (can be overridden per request, or use SSH_KEY_PATH)
  • SSH_KEY_PATH - Path to SSH private key file (alternative to password authentication)
  • SSH_KEY_PASSPHRASE - Passphrase for encrypted SSH private key (if needed)
  • SSH_SHELL_CMD - Remote shell command to use (default: sh -c)

Input Format:

{
  "host": "example.com",
  "port": 22,
  "user": "username",
  "password": "password",
  "script": "ls -la /tmp",
  "timeout": 30
}

Or using key-based authentication:

{
  "host": "example.com",
  "user": "username",
  "key_path": "/path/to/private/key",
  "script": "ls -la /tmp",
  "timeout": 30
}

Output Format:

{
  "host": "example.com",
  "script": "ls -la /tmp",
  "stdout": "total 1234\ndrwxrwxrwt...",
  "stderr": "",
  "exit_code": 0,
  "success": true,
  "error": ""
}

Docker Image:

docker run -e SSH_HOST=example.com -e SSH_USER=user -e SSH_PASSWORD=pass ghcr.io/mudler/mcps/ssh:latest

Or with key-based authentication:

docker run -e SSH_HOST=example.com -e SSH_USER=user -e SSH_KEY_PATH=/path/to/key -v /host/keys:/path/to/key ghcr.io/mudler/mcps/ssh:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "ssh": {
          "command": "docker",
          "env": {
            "SSH_HOST": "example.com",
            "SSH_USER": "username",
            "SSH_PASSWORD": "password",
            "SSH_SHELL_CMD": "bash -c"
          },
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/ssh:master"
          ]
        }
      }
    }

πŸ”§ Script Runner Server

A flexible script and program execution server that allows AI models to run pre-defined scripts and programs as tools. Scripts can be defined inline or via file paths, and programs can be executed directly.

Features:

  • Execute scripts from file paths or inline content
  • Run arbitrary programs/commands
  • Automatic interpreter detection (shebang or file extension)
  • Configurable timeouts per script/program
  • Custom working directories and environment variables
  • Comprehensive output capture (stdout, stderr, exit code, duration)

Configuration:

  • SCRIPTS - JSON string defining scripts/programs (required)

Script Configuration Format:

[
  {
    "name": "hello_world",
    "description": "A simple hello world script",
    "content": "#!/bin/bash\necho 'Hello, World!'",
    "timeout": 10
  },
  {
    "name": "run_python",
    "description": "Run a Python script from file",
    "path": "/scripts/process_data.py",
    "interpreter": "python3",
    "timeout": 30,
    "working_dir": "/data"
  },
  {
    "name": "list_files",
    "description": "List files in a directory",
    "command": "ls",
    "timeout": 5
  }
]

Executor Object Fields:

  • name (string, required): Tool name (must be valid identifier)
  • description (string, required): Tool description
  • content (string, optional): Inline script content (mutually exclusive with path and command)
  • path (string, optional): Path to script file (mutually exclusive with content and command)
  • command (string, optional): Command/program to execute (mutually exclusive with content and path)
  • interpreter (string, optional): Interpreter to use (default: auto-detect from shebang or file extension)
  • timeout (int, optional): Timeout in seconds (default: 30)
  • working_dir (string, optional): Working directory for execution
  • env (map[string]string, optional): Additional environment variables
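The interpreter auto-detection rule (shebang first, then file extension) can be sketched as follows; `detectInterpreter` and the extension table are illustrative assumptions, not the server's actual code:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// detectInterpreter prefers the shebang line, then falls back to the
// file extension, as described in the interpreter field documentation.
func detectInterpreter(path, content string) string {
	if strings.HasPrefix(content, "#!") {
		line := strings.SplitN(content, "\n", 2)[0]
		fields := strings.Fields(strings.TrimPrefix(line, "#!"))
		// "#!/usr/bin/env python3" -> python3, "#!/bin/bash" -> /bin/bash
		if len(fields) == 2 && strings.HasSuffix(fields[0], "/env") {
			return fields[1]
		}
		if len(fields) > 0 {
			return fields[0]
		}
	}
	switch filepath.Ext(path) { // assumed extension mapping
	case ".py":
		return "python3"
	case ".sh":
		return "sh"
	}
	return "sh" // assumed default
}

func main() {
	fmt.Println(detectInterpreter("run.py", "#!/usr/bin/env python3\nprint('hi')")) // python3
	fmt.Println(detectInterpreter("script.sh", "echo hi"))                         // sh
}
```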

Execution Input:

{
  "args": ["arg1", "arg2"]
}

Execution Output:

{
  "stdout": "Hello, World!\n",
  "stderr": "",
  "exit_code": 0,
  "duration_ms": 15
}

Docker Image:

docker run -e SCRIPTS='[{"name":"hello","description":"Hello script","content":"#!/bin/bash\necho hello"}]' ghcr.io/mudler/mcps/scripts:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "scripts": {
          "command": "docker",
          "env": {
            "SCRIPTS": "[{\"name\":\"hello\",\"description\":\"Hello script\",\"content\":\"#!/bin/bash\\necho hello\"},{\"name\":\"list_files\",\"description\":\"List files\",\"command\":\"ls\"}]"
          },
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/scripts:master"
          ]
        }
      }
    }

πŸ“š LocalRecall Server

A knowledge base management server that provides tools to interact with LocalRecall's REST API for managing collections, searching content, and managing documents.

Features:

  • Search content in collections
  • Create and reset collections
  • Add documents to collections
  • List collections and files
  • Delete entries from collections
  • Configurable tool enablement for security

Tools:

  • search - Search content in a LocalRecall collection
  • create_collection - Create a new collection
  • reset_collection - Reset (clear) a collection
  • add_document - Add a document to a collection
  • list_collections - List all collections
  • list_files - List files in a collection
  • delete_entry - Delete an entry from a collection

Configuration:

  • LOCALRECALL_URL - Base URL for LocalRecall API (default: http://localhost:8080)
  • LOCALRECALL_API_KEY - Optional API key for authentication (sent as Authorization: Bearer <key>)
  • LOCALRECALL_COLLECTION - Default collection name (if set, tools are registered without collection_name parameter - the collection is automatically used from the environment variable)
  • LOCALRECALL_ENABLED_TOOLS - Comma-separated list of tools to enable (default: all tools enabled). Valid values: search, create_collection, reset_collection, add_document, list_collections, list_files, delete_entry

Note: When LOCALRECALL_COLLECTION is set, the tools search, add_document, list_files, and delete_entry are registered with different input schemas that do not include the collection_name parameter. The collection name is automatically taken from the environment variable.

Search Input Format:

When LOCALRECALL_COLLECTION is not set:

{
  "collection_name": "myCollection",
  "query": "search term",
  "max_results": 5
}

When LOCALRECALL_COLLECTION is set (e.g., LOCALRECALL_COLLECTION=myCollection), the tool schema does not include collection_name:

{
  "query": "search term",
  "max_results": 5
}

Search Output Format:

{
  "query": "search term",
  "max_results": 5,
  "results": [
    {
      "content": "...",
      "metadata": {...}
    }
  ],
  "count": 1
}

Add Document Input Format:

When LOCALRECALL_COLLECTION is not set:

{
  "collection_name": "myCollection",
  "file_path": "/path/to/file.txt",
  "filename": "file.txt"
}

Or with inline content:

{
  "collection_name": "myCollection",
  "file_content": "Document content here",
  "filename": "document.txt"
}

When LOCALRECALL_COLLECTION is set, the tool schema does not include collection_name:

{
  "file_path": "/path/to/file.txt",
  "filename": "file.txt"
}

List Files Input Format:

When LOCALRECALL_COLLECTION is not set:

{
  "collection_name": "myCollection"
}

When LOCALRECALL_COLLECTION is set, the tool schema has no parameters (empty object):

{}

Delete Entry Input Format:

When LOCALRECALL_COLLECTION is not set:

{
  "collection_name": "myCollection",
  "entry": "filename.txt"
}

When LOCALRECALL_COLLECTION is set, the tool schema does not include collection_name:

{
  "entry": "filename.txt"
}

Docker Image:

docker run -e LOCALRECALL_URL=http://localhost:8080 -e LOCALRECALL_API_KEY=your-key-here ghcr.io/mudler/mcps/localrecall:latest

With default collection (tools will not require collection_name parameter):

docker run -e LOCALRECALL_URL=http://localhost:8080 -e LOCALRECALL_COLLECTION=myCollection ghcr.io/mudler/mcps/localrecall:latest

When LOCALRECALL_COLLECTION is set, the collection-specific tools (search, add_document, list_files, delete_entry) are automatically configured to use that collection, and the collection_name parameter is removed from their input schemas.

Enable specific tools only:

docker run -e LOCALRECALL_URL=http://localhost:8080 -e LOCALRECALL_ENABLED_TOOLS="search,list_collections,list_files" ghcr.io/mudler/mcps/localrecall:latest

LocalAI configuration (to add to the model config):

mcp:
  stdio: |
    {
      "mcpServers": {
        "localrecall": {
          "command": "docker",
          "env": {
            "LOCALRECALL_URL": "http://localhost:8080",
            "LOCALRECALL_API_KEY": "your-api-key",
            "LOCALRECALL_COLLECTION": "myCollection",
            "LOCALRECALL_ENABLED_TOOLS": "search,list_collections,add_document"
          },
          "args": [
            "run", "-i", "--rm",
            "ghcr.io/mudler/mcps/localrecall:master"
          ]
        }
      }
    }

Development

Prerequisites

  • Go 1.24.7 or later
  • Docker (for containerized builds)
  • Make (for using the Makefile)

Building

Use the provided Makefile for easy development:

# Show all available commands
make help

# Development workflow
make dev

# Build specific server
make MCP_SERVER=duckduckgo build
make MCP_SERVER=weather build
make MCP_SERVER=memory build
make MCP_SERVER=shell build
make MCP_SERVER=ssh build
make MCP_SERVER=scripts build
make MCP_SERVER=localrecall build

# Run tests and checks
make ci-local

# Build multi-architecture images
make build-multiarch

Adding New Servers

To add a new MCP server:

  1. Create a new directory under the project root
  2. Implement the server following the MCP SDK patterns
  3. Update the GitHub Actions workflow matrix in .github/workflows/image.yml
  4. Update this README with the new server information
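The exact contents of .github/workflows/image.yml are not shown here, but for step 3 a typical GitHub Actions build matrix entry might look like this (a hedged sketch; the real job and key names may differ):

```yaml
# .github/workflows/image.yml (sketch; actual structure may differ)
jobs:
  build:
    strategy:
      matrix:
        server:
          - duckduckgo
          - weather
          - memory
          - shell
          - ssh
          - homeassistant
          - scripts
          - localrecall
          - your-server  # add the new server's directory name here
```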

Example server structure:

package main

import (
    "context"
    "log"

    "github.com/modelcontextprotocol/go-sdk/mcp"
)

func main() {
    server := mcp.NewServer(&mcp.Implementation{
        Name:    "your-server",
        Version: "v1.0.0",
    }, nil)

    // Add your tools here
    mcp.AddTool(server, &mcp.Tool{
        Name:        "your-tool",
        Description: "your tool description",
    }, YourToolFunction)

    if err := server.Run(context.Background(), &mcp.StdioTransport{}); err != nil {
        log.Fatal(err)
    }
}

Docker Images

Docker images are automatically built and pushed to GitHub Container Registry:

  • ghcr.io/mudler/mcps/duckduckgo:latest - Latest DuckDuckGo server
  • ghcr.io/mudler/mcps/duckduckgo:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/duckduckgo:master - Development versions
  • ghcr.io/mudler/mcps/weather:latest - Latest Weather server
  • ghcr.io/mudler/mcps/weather:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/weather:master - Development versions
  • ghcr.io/mudler/mcps/memory:latest - Latest Memory server
  • ghcr.io/mudler/mcps/memory:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/memory:master - Development versions
  • ghcr.io/mudler/mcps/shell:latest - Latest Shell server
  • ghcr.io/mudler/mcps/shell:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/shell:master - Development versions
  • ghcr.io/mudler/mcps/ssh:latest - Latest SSH server
  • ghcr.io/mudler/mcps/ssh:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/ssh:master - Development versions
  • ghcr.io/mudler/mcps/homeassistant:latest - Latest Home Assistant server
  • ghcr.io/mudler/mcps/homeassistant:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/homeassistant:master - Development versions
  • ghcr.io/mudler/mcps/scripts:latest - Latest Script Runner server
  • ghcr.io/mudler/mcps/scripts:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/scripts:master - Development versions
  • ghcr.io/mudler/mcps/localrecall:latest - Latest LocalRecall server
  • ghcr.io/mudler/mcps/localrecall:v1.0.0 - Tagged versions
  • ghcr.io/mudler/mcps/localrecall:master - Development versions

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Run make ci-local to ensure all checks pass
  6. Submit a pull request

License

This project is licensed under the terms specified in the LICENSE file.

Model Context Protocol

This project implements servers for the Model Context Protocol (MCP), a standard for connecting AI models to external data sources and tools.

For more information about MCP, visit the official documentation.
