Use any OpenCode model from Open WebUI, LangChain, or the OpenAI SDK, with a one-line install


📄 Summary

OpenCode consolidates various models such as GitHub Copilot, Ollama, Anthropic, and Gemini in one place. However, other tools in the workflow, including Open WebUI, LangChain, Chatbox, Continue, Zed, and custom scripts, require the same models to be re-entered with their respective API keys and base URLs. To address this issue, the opencode-llm-proxy plugin was developed. It runs a local HTTP server and translates between the API formats used by different tools and the OpenCode model list, streamlining the model usage process.
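To make the translation idea concrete, here is a minimal sketch (not the plugin's actual code) of the kind of format conversion such a proxy performs: mapping an OpenAI-style chat request onto an Anthropic-style one. The model name and field choices are illustrative assumptions.

```python
import json

# Illustrative sketch, NOT opencode-llm-proxy's real implementation:
# convert an OpenAI-style chat request body into an Anthropic-style one.
def openai_to_anthropic(body: dict) -> dict:
    # OpenAI embeds the system prompt in the messages list;
    # Anthropic's Messages API expects it as a top-level "system" field.
    system_parts = [m["content"] for m in body["messages"] if m["role"] == "system"]
    messages = [m for m in body["messages"] if m["role"] != "system"]
    return {
        "model": body["model"],
        "system": "\n".join(system_parts),
        "messages": messages,
        # Anthropic requires max_tokens; pick a default if the client omitted it.
        "max_tokens": body.get("max_tokens", 1024),
    }

# Example OpenAI-style request (model name is a placeholder).
request = {
    "model": "anthropic/claude-sonnet",
    "messages": [
        {"role": "system", "content": "Be brief."},
        {"role": "user", "content": "Hello"},
    ],
}
translated = openai_to_anthropic(request)
print(json.dumps(translated, indent=2))
```

A proxy like this sits on localhost, accepts requests in whatever format the client tool speaks, rewrites them as above, and routes them to the model already configured in OpenCode, so each tool only needs one base URL instead of its own set of keys.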
