Proposal: Support poe.com as an LLM provider #6594

@PeterDaveHello

Description

What specific problem does this solve?

Poe provides an API that's fully compatible with the OpenAI Chat Completions format. If Roo Code supports Poe as a provider, users can simply swap the base URL and API key to access a large selection of models and bots. Poe covers leading models like GPT, Claude, Gemini, Llama, Mistral, Grok, Kimi, GLM, and more. This would make it easier for users to switch between models and providers without changing their workflow or code, and reduce the hassle of juggling multiple provider accounts and keys.

Additional context (optional)

For reference, Poe’s API documentation is here: https://creator.poe.com/docs/external-applications/openai-compatible-api
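As a rough illustration of how small the integration surface should be, the sketch below points the official OpenAI Node SDK at Poe's OpenAI-compatible endpoint. The base URL and the model name are assumptions based on a quick read of the docs linked above and should be verified there before implementation.

```ts
import OpenAI from "openai";

// Minimal sketch: reuse the standard OpenAI client, only swapping
// the base URL and API key. Values below are assumptions, not
// verified against Poe's docs.
const client = new OpenAI({
  apiKey: process.env.POE_API_KEY, // a Poe API key, not an OpenAI key
  baseURL: "https://api.poe.com/v1", // assumed Poe endpoint; confirm in the docs above
});

async function main() {
  const response = await client.chat.completions.create({
    model: "Claude-3.5-Sonnet", // hypothetical Poe bot/model name, for illustration only
    messages: [{ role: "user", content: "Hello from Roo Code" }],
  });
  console.log(response.choices[0]?.message?.content);
}

main().catch(console.error);
```

If Roo Code's existing OpenAI-compatible provider path already accepts a custom base URL, a Poe provider might mostly be a matter of shipping sensible defaults (base URL, key label, model list) rather than a new client implementation.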

Roo Code Task Links (Optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear impact and context

Interested in implementing this?

  • Yes, I'd like to help implement this feature

Implementation requirements

  • I understand this needs approval before implementation begins

How should this be solved? (REQUIRED if contributing, optional otherwise)

No response

How will we know it works? (Acceptance Criteria - REQUIRED if contributing, optional otherwise)

No response

Technical considerations (REQUIRED if contributing, optional otherwise)

No response

Trade-offs and risks (REQUIRED if contributing, optional otherwise)

No response

Metadata

Labels

  • Enhancement (New feature or request)
  • Issue - In Progress (Someone is actively working on this. Should link to a PR soon.)
  • proposal

Projects

Status: Done
