# Model Configuration Guide

OpenClaw supports multiple AI model providers, all configured under the `models` field of `openclaw.json`.

## 1. Built-in Providers
| Provider | Model family | Proxy required? |
|---|---|---|
| anthropic | Claude | Proxy needed in mainland China |
| openai | GPT | Proxy needed in mainland China |
| openrouter | Multi-model aggregator | Proxy needed in mainland China |
| deepseek | DeepSeek | Direct access from mainland China |
| moonshot | Kimi | Direct access from mainland China |
| minimax | MiniMax | Direct access from mainland China |
| ollama | Local models | Not needed |
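Built-in providers generally work out of the box, but if you need to point one at a non-default endpoint you can declare it explicitly using the same `models.providers` shape as the examples below. A minimal sketch for a local Ollama instance, assuming Ollama's OpenAI-compatible endpoint and an illustrative model `id` (whether `apiKey` may be omitted for local providers is an assumption here):

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "ollama": {
        "baseUrl": "http://localhost:11434/v1",
        "api": "openai-completions",
        "models": [
          { "id": "llama3.1", "name": "Llama 3.1 (local)", "contextWindow": 128000, "maxTokens": 8192 }
        ]
      }
    }
  }
}
```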
## 2. Chinese-Hosted Model Examples

### Qwen (Alibaba Cloud Model Studio)
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "qwen": {
        "baseUrl": "https://dashscope.aliyuncs.com/compatible-mode/v1",
        "apiKey": "${QWEN_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "qwen-max", "name": "Qwen Max", "contextWindow": 32000, "maxTokens": 8000 },
          { "id": "qwen-plus", "name": "Qwen Plus", "contextWindow": 131072, "maxTokens": 8000 }
        ]
      }
    }
  }
}
```
```bash
openclaw env set QWEN_API_KEY sk-xxx
```
### Doubao (Volcano Engine)
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "doubao": {
        "baseUrl": "https://ark.cn-beijing.volces.com/api/v3",
        "apiKey": "${DOUBAO_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "doubao-pro-32k", "name": "Doubao Pro 32k", "contextWindow": 32000, "maxTokens": 4096 },
          { "id": "doubao-pro-4k", "name": "Doubao Pro 4k", "contextWindow": 4000, "maxTokens": 2048 }
        ]
      }
    }
  }
}
```
```bash
openclaw env set DOUBAO_API_KEY xxx
```
### Moonshot (Kimi)
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "moonshot": {
        "baseUrl": "https://api.moonshot.cn/v1",
        "apiKey": "${MOONSHOT_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "moonshot-v1-8k", "name": "Kimi-8k", "contextWindow": 8000, "maxTokens": 4096 },
          { "id": "moonshot-v1-128k", "name": "Kimi-128k", "contextWindow": 128000, "maxTokens": 8192 }
        ]
      }
    }
  }
}
```
### DeepSeek
```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "deepseek": {
        "baseUrl": "https://api.deepseek.com/v1",
        "apiKey": "${DEEPSEEK_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "deepseek-chat", "name": "DeepSeek V3", "contextWindow": 64000, "maxTokens": 8192 },
          { "id": "deepseek-reasoner", "name": "DeepSeek R1", "contextWindow": 64000, "maxTokens": 8192 }
        ]
      }
    }
  }
}
```
Note on the `api` field: most Chinese-hosted services speak the OpenAI Chat Completions protocol, so use `openai-completions`. Reserve `openai-responses` for OpenAI's native Responses API; don't mix the two.
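For contrast, an OpenAI provider entry targeting the Responses API might look like the sketch below. This assumes the same provider schema as the examples above; the model `id` and the `contextWindow`/`maxTokens` values are illustrative assumptions, not verified limits:

```json
{
  "models": {
    "mode": "merge",
    "providers": {
      "openai": {
        "baseUrl": "https://api.openai.com/v1",
        "apiKey": "${OPENAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          { "id": "gpt-4.1", "name": "GPT-4.1", "contextWindow": 1000000, "maxTokens": 32768 }
        ]
      }
    }
  }
}
```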
## 3. Aliases, Primary Model, and Fallback Chain
```json
{
  "agents": {
    "defaults": {
      "models": {
        "anthropic/claude-sonnet-4-5": { "alias": "sonnet" },
        "deepseek/deepseek-chat": { "alias": "ds" },
        "qwen/qwen-max": { "alias": "qwen" },
        "doubao/doubao-pro-32k": { "alias": "db" }
      },
      "model": {
        "primary": "anthropic/claude-sonnet-4-5",
        "fallbacks": ["deepseek/deepseek-chat", "qwen/qwen-max"]
      }
    }
  }
}
```
Quick switching mid-conversation:
```bash
/model ds      # switch to DeepSeek (saves cost)
/model qwen    # switch to Qwen
/model sonnet  # switch back to Claude
```
## 4. Usage & Cost Tracking
```bash
openclaw status     # show the model currently in use
openclaw dashboard  # detailed token usage statistics
```
To prefix every reply with the model that produced it, for easy verification:
```json
{ "messages": { "responsePrefix": "[{model}] " } }
```
## 5. Model Selection Recommendations

| Scenario | Recommended model | Why |
|---|---|---|
| Everyday chat / writing | claude-sonnet-4-5 | Strong quality, fast |
| Complex reasoning / code | claude-opus-4-6 | Strongest results |
| High-volume automation | deepseek-chat or Doubao | Low cost, direct access from mainland China |
| Long-document processing | moonshot-v1-128k | 128k context window |
| Offline / self-hosted | Local models via ollama | Data never leaves your machine |