Releases: inference-gateway/inference-gateway
🚀 Version 0.23.5
0.23.5 (2026-04-01)
🔧 Miscellaneous
- Add stale issues workflow to auto-close inactive issues (0989d22)
- Bump CI and dev containers dependencies (#255) (aca7a05)
- Change assignees from 'maintainers' to 'core' (#260) (bdc1cf2)
- deps: Bump claude code to 2.1.70 (c2d8fa4)
- deps: Bump express-rate-limit (#253) (d778280)
- deps: Bump github.com/buger/jsonparser (#258) (17d0db4)
- deps: Bump github.com/buger/jsonparser from 1.1.1 to 1.1.2 (#259) (936a7a8)
- deps: Bump hono (#254) (70d193f)
- deps: Bump the examples-go group across 1 directory with 3 updates (#256) (5535384)
- deps: Bump the examples-npm group (#257) (e9d662f)
- Remove include scope from dependabot (326771b)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.5 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.5
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
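For orientation, the basic Docker Compose example downloaded above can be sketched roughly as follows. This is an illustrative guess at its shape, not the actual file from the repository; it assumes a single gateway service that reads provider API keys (such as OPENAI_API_KEY from the install steps) out of the .env file:

```yaml
# Hypothetical sketch of examples/docker-compose/basic/docker-compose.yml,
# not the actual file shipped in the repository.
services:
  inference-gateway:
    image: ghcr.io/inference-gateway/inference-gateway:0.23.5
    ports:
      - "8080:8080"        # gateway listens on 8080, as in the docker run example
    environment:
      # Provider API keys come from .env (copied from .env.example above)
      - OPENAI_API_KEY=${OPENAI_API_KEY}
    restart: unless-stopped
```

Check the downloaded docker-compose.yml itself for the authoritative service names and variables; the .env.example file lists which keys the example actually expects.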
🚀 Version 0.23.4
0.23.4 (2026-03-05)
🐛 Bug Fixes
- Do not run infer agent if the comment was made by dependabot (2ea9aa9)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.4 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.4
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.23.3
0.23.3 (2026-03-05)
🔧 Miscellaneous
- deps: Bump go.opentelemetry.io/otel/sdk (#242) (5660f71)
- deps: Bump @hono/node-server (#245) (379e1b9)
- deps: Bump @modelcontextprotocol/sdk (#235) (987087e)
- deps: Bump ajv in /examples/docker-compose/mcp/pizza-server (#240) (9a24b04)
- deps: bump dev tool and CI action versions (#247) (0deb374)
- deps: Bump hono in /examples/docker-compose/mcp/pizza-server (#234) (ad5bd93)
- deps: Bump hono in /examples/docker-compose/mcp/pizza-server (#239) (289f2d1)
- deps: Bump hono in /examples/docker-compose/mcp/pizza-server (#241) (bdfb283)
- deps: Bump hono in /examples/docker-compose/mcp/pizza-server (#244) (cbad88f)
- deps: Bump qs in /examples/docker-compose/mcp/pizza-server (#238) (df33864)
- deps: Bump the examples-go group across 7 directories with 5 updates (#250) (11f16ec)
- deps: Bump the examples-npm group (#249) (42ae2f5)
- deps: Reduce dependabot noise for examples dependencies (#248) (0611eaa)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.3 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.3
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.23.2
0.23.2 (2026-01-23)
🐛 Bug Fixes
- examples: Add missing ghcr.io prefix to oci images (02a84b8)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.2 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.2
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.23.1
0.23.1 (2026-01-23)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.1 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.1
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.23.0
0.23.0 (2026-01-22)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.23.0 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.23.0
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.22.10
0.22.10 (2026-01-07)
🔧 Miscellaneous
- deps: Update qs to 6.14.1 to resolve security vulnerability (6f67514)
- deps: Bump @modelcontextprotocol/sdk (#226) (e0802fd)
- deps: Update infer CLI - update config and add shortcut files (7dd101b)
- Update dependency versions (#227) (b570fec)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.22.10 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.22.10
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.22.9
0.22.9 (2025-12-14)
🐛 Bug Fixes
- install: Prefix the path with the INSTALL_DIR variable (78ed26b)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.22.9 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.22.9
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.22.8
0.22.8 (2025-12-12)
🔧 Miscellaneous
- deps: Bump claude code to its latest for development (9c4a7ee)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.22.8 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.22.8
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.
🚀 Version 0.22.7
0.22.7 (2025-12-11)
📚 Documentation
- Add AGENTS.md for AI agent guidance (91c2131)
🔧 Miscellaneous
- deps: Bump infer CLI version to its latest (df3f040)
- deps: Bump semantic-release to its latest version (a3405b5)
📦 Quick Installation
Binary Installation
Install latest version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | bash
Install this version:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | VERSION=v0.22.7 bash
Install to custom directory:
curl -fsSL https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/install.sh | INSTALL_DIR=~/.local/bin bash
Verify installation:
inference-gateway --version
Running as a Container (Recommended)
Using Docker:
docker run -d \
-p 8080:8080 \
-e OPENAI_API_KEY=your-api-key \
ghcr.io/inference-gateway/inference-gateway:0.22.7
Using Docker Compose:
# Download example configuration
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/docker-compose.yml
curl -O https://raw.githubusercontent.com/inference-gateway/inference-gateway/main/examples/docker-compose/basic/.env.example
mv .env.example .env
# Edit .env with your API keys
# Start the gateway
docker compose up -d
For more installation options and documentation, visit the README.