
LocalGPT Workspace

A persistent memory and skills directory for AI agents.

Documentation

Structure

├── MEMORY.md          # Long-term curated knowledge
├── HEARTBEAT.md       # Pending tasks for autonomous mode
├── SOUL.md            # AI persona and behavioral guidelines
├── LocalGPT.md        # Standing instructions for conversations
├── memory/            # Daily logs and notes
├── skills/            # Custom skills (world generation, etc.)
├── .mcp.json          # MCP server config for Claude CLI / Codex
├── .gemini/
│   └── settings.json  # MCP server config for Gemini CLI
└── .claude/
    └── settings.local.json  # Claude CLI permissions and MCP settings

See CLAUDE.md for detailed documentation on the skills format and world.ron structure.

CLI Backend MCP Configuration

When LocalGPT Gen is used interactively with a CLI backend (Claude CLI, Gemini CLI, or Codex), these configs tell the CLI to connect to the existing Bevy window via the MCP relay (--connect) instead of spawning a new one.

.mcp.json — Claude CLI / Codex

{
  "mcpServers": {
    "localgpt-gen": {
      "command": "localgpt-gen",
      "args": ["mcp-server", "--connect"]
    }
  }
}

.gemini/settings.json — Gemini CLI

{
  "mcpServers": {
    "localgpt-gen": {
      "command": "localgpt-gen",
      "args": ["mcp-server", "--connect"]
    }
  }
}

--connect makes the spawned MCP server process relay tool calls to the running gen process's TCP port (default 9878) instead of creating its own Bevy window. Remove --connect if you want standalone MCP mode (each CLI gets its own window).
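Conceptually, the relay process sits between the CLI's MCP transport and the running gen window's TCP port. The sketch below illustrates that forwarding step in Python. It is illustrative only: the function name, the newline-delimited JSON-RPC framing, and the error handling are assumptions, not the actual localgpt-gen wire protocol.

```python
import json
import socket

GEN_PORT = 9878  # default TCP port of the running gen process (from the docs above)

def forward(sock: socket.socket, request: dict) -> dict:
    """Relay one JSON-RPC message to the gen process and return its reply.

    Assumes newline-delimited JSON framing, which is a common MCP/JSON-RPC
    transport convention but is NOT confirmed by this README.
    """
    sock.sendall((json.dumps(request) + "\n").encode())
    buf = b""
    while not buf.endswith(b"\n"):
        chunk = sock.recv(4096)
        if not chunk:
            raise ConnectionError("gen process closed the connection")
        buf += chunk
    return json.loads(buf)
```

A relay in this mode never opens its own Bevy window; it only forwards tool calls, so several CLIs can share one running scene.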

See CLI Mode (MCP Relay) for details.
