Day 324, 2026 17:24:32
Using other models with gemini-cli

Although the gemini-cli team has explicitly said it will not support other models, you can work around this with the LiteLLM proxy (think of it as a command-line counterpart to new-api): run it as a translation layer, then use environment variables to point the CLI at the converted API.

GOOGLE_GEMINI_BASE_URL=http://localhost:4002
GEMINI_API_KEY=sk-litellm-local
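Concretely, assuming the proxy listens on port 4002 and `sk-litellm-local` matches the proxy's `master_key` (both taken from the config in this post), the two variables can be exported before launching the CLI:

```shell
# Point gemini-cli at the local LiteLLM proxy instead of Google's endpoint.
# The key must match the proxy's master_key, not a real Google API key.
export GOOGLE_GEMINI_BASE_URL=http://localhost:4002
export GEMINI_API_KEY=sk-litellm-local
# gemini   # then launch the CLI as usual
```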

The LiteLLM config is as follows; it takes the Gemini-style requests from the CLI and forwards them as OpenAI-format calls to the upstream provider:

.openai_shared: &openai_shared
  api_base: <your provider's OpenAI-compatible URL>
  api_key: <your API key>
  custom_llm_provider: openai
  timeout: 60

model_list:
  - model_name: qwen3.5-plus
    litellm_params:
      <<: *openai_shared
      model: qwen3.5-plus

router_settings:
  num_retries: 4
  model_group_alias:
    "gemini-3.1-pro-preview-customtools": "qwen3.5-plus"
    "gemini-3-flash-preview-customtools": "qwen3.5-plus"
    "gemini-3-flash-preview": "qwen3.5-plus"
    "gemini-3.1-flash-lite-preview-customtools": "qwen3.5-plus"
    "gemini-2.5-pro": "qwen3.5-plus"
    "gemini-2.5-flash": "qwen3.5-plus"
    "gemini-2.5-flash-lite": "qwen3.5-plus"
    "gemini-3-pro-preview": "qwen3.5-plus"
    "qwen3.5-plus-customtools": "qwen3.5-plus"

general_settings:
  master_key: sk-litellm-local
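gemini-cli requests different built-in model IDs depending on its version and the task at hand, which is why `model_group_alias` fans every Gemini name into the one upstream model. A minimal sketch of the lookup the router effectively performs (the mapping is copied from the config above; `resolve` is a hypothetical helper for illustration, not LiteLLM's actual API):

```python
# Alias map copied from router_settings above: every model ID that
# gemini-cli may request is routed to the single upstream model.
ALIASES = {
    "gemini-3.1-pro-preview-customtools": "qwen3.5-plus",
    "gemini-3-flash-preview-customtools": "qwen3.5-plus",
    "gemini-3-flash-preview": "qwen3.5-plus",
    "gemini-3.1-flash-lite-preview-customtools": "qwen3.5-plus",
    "gemini-2.5-pro": "qwen3.5-plus",
    "gemini-2.5-flash": "qwen3.5-plus",
    "gemini-2.5-flash-lite": "qwen3.5-plus",
    "gemini-3-pro-preview": "qwen3.5-plus",
    "qwen3.5-plus-customtools": "qwen3.5-plus",
}

def resolve(model: str) -> str:
    """Map a requested model ID to the model actually served.

    Names not in the alias map pass through unchanged; they are only
    servable if they also appear in model_list (here, qwen3.5-plus).
    """
    return ALIASES.get(model, model)
```

So `resolve("gemini-2.5-pro")` yields `"qwen3.5-plus"`, and a direct request for `qwen3.5-plus` passes through untouched.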
