Feature Request: Add support for ModelScope
Is your feature request related to a problem? Please describe.
The application currently lacks support for ModelScope as an API provider. ModelScope is a major model-as-a-service platform from Alibaba Cloud, offering a vast collection of models. Not supporting it excludes a significant ecosystem and prevents users from accessing its unique and powerful models.
Describe the solution you'd like
I would like to request the addition of ModelScope as a new provider. ModelScope provides an OpenAI-compatible API endpoint, which should make for a relatively simple integration; a minimal client sketch follows the list below. The key information for the provider would be:
- Provider Name: ModelScope
- API Base URL: https://api-inference.modelscope.cn/v1
- Authentication: Bearer token using an API key from DashScope (DASHSCOPE_API_KEY).
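For illustration, here is a minimal sketch of how such an integration might call the endpoint from Python using the openai client package pointed at the ModelScope base URL. The use of the openai package is my assumption; the base URL, the DASHSCOPE_API_KEY environment variable, the model name, and the prompt are taken from the details above and the cURL sample further down.

import os

from openai import OpenAI

# Minimal sketch, not the application's actual provider code: point the
# standard OpenAI-compatible client at the ModelScope base URL.
client = OpenAI(
    base_url="https://api-inference.modelscope.cn/v1",
    api_key=os.environ["DASHSCOPE_API_KEY"],  # sent as a Bearer token
)

# "qwen-plus" is the example model name from the cURL sample below.
response = client.chat.completions.create(
    model="qwen-plus",
    messages=[{"role": "user", "content": "Explain the importance of fast language models"}],
)

print(response.choices[0].message.content)

If this behaves as expected, adding the provider should mostly be a matter of configuring the base URL and API key the same way as the existing OpenAI-compatible providers.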
Describe alternatives you've considered
Using other providers, which do not offer the specific models available through ModelScope, especially its Chinese-language models.
Additional context
The official documentation page is on the Alibaba Cloud website; access to the models goes through the DashScope (灵积) service.
Below is a sample cURL command demonstrating how to call their OpenAI-compatible chat completions endpoint:
curl -X POST https://api-inference.modelscope.cn/v1/chat/completions \
-H "Authorization: Bearer $DASHSCOPE_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "qwen-plus",
"messages": [{
"role": "user",
"content": "Explain the importance of fast language models"
}]
}'
Feature Request: Add support for ModelScope
Is your feature request related to a problem? Please describe.
The application currently does not support ModelScope as an LLM provider. ModelScope (魔搭) is a major model-as-a-service platform under Alibaba Cloud, offering a vast collection of models. Not supporting it excludes an important model ecosystem and prevents users from accessing its unique and powerful models.
Describe the solution you'd like
I would like ModelScope to be supported as a new provider option. ModelScope provides an OpenAI-compatible API, so the integration should be able to follow the existing provider implementations. The key configuration is as follows:
- Provider Name: ModelScope
- API Base URL: https://api-inference.modelscope.cn/v1
- Authentication: Bearer token authentication using an API key (DASHSCOPE_API_KEY) from Alibaba Cloud's DashScope (灵积) service.
Describe alternatives you've considered
Using other providers, but they do not offer the specific models available on ModelScope, especially its strength in Chinese-language models.
Additional context
The official documentation for the service is on the Alibaba Cloud website; model calls are made via its DashScope (灵积) service.
Below is a sample cURL request calling its OpenAI-compatible chat/completions endpoint:
curl -X POST https://api-inference.modelscope.cn/v1/chat/completions \
-H "Authorization: Bearer $DASHSCOPE_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "qwen-plus",
"messages": [{
"role": "user",
"content": "解释一下为什么快速的语言模型很重要"
}]
}'