How to Set Up NadirClaw with Docker + Ollama for Zero-Cost Local LLM Routing


📄 Summary

NadirClaw is an LLM router designed to optimize the use of AI coding tools and reduce API costs. By classifying prompts, NadirClaw routes simple requests to cheap or free models, while directing complex requests to premium models, achieving a 40-70% reduction in API expenses. This tutorial provides steps to run NadirClaw with Ollama in Docker for completely free local routing, requiring no API keys, costs, or external dependencies. By the end of the guide, users will have NadirClaw running in Docker as an OpenAI-compatible proxy and will be able to use free models locally, such as Llama and Qwen.
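The routing idea described above can be sketched in a few lines. Note that this is an illustrative toy, not NadirClaw's actual classifier: the keyword heuristic, the `ROUTES` table, and the function names below are assumptions made for demonstration. The Ollama model tags (`llama3.2`, `qwen2.5-coder`) are real, but which models NadirClaw maps to each tier is configuration-dependent.

```python
def classify_prompt(prompt: str) -> str:
    """Toy stand-in for NadirClaw's prompt classification: long or
    multi-step-looking prompts are treated as 'complex', the rest 'simple'.
    The real router's heuristics are not documented in this summary."""
    complex_markers = ("refactor", "architecture", "debug", "multi-step")
    if len(prompt) > 400 or any(m in prompt.lower() for m in complex_markers):
        return "complex"
    return "simple"

# Hypothetical routing table: simple prompts go to a small local model,
# complex ones to a stronger local coding model (both free via Ollama).
ROUTES = {"simple": "llama3.2", "complex": "qwen2.5-coder"}

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions payload for the
    model the router selects. No network call is made here; in practice
    the client would POST this to the NadirClaw proxy endpoint."""
    model = ROUTES[classify_prompt(prompt)]
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

print(build_chat_request("What does this regex do?")["model"])  # llama3.2
print(build_chat_request("Refactor this service layer")["model"])  # qwen2.5-coder
```

Because NadirClaw exposes an OpenAI-compatible proxy, any OpenAI-style client can send the payload above by pointing its base URL at the local proxy instead of api.openai.com; no code changes beyond the endpoint are needed.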

Powered by Cloudflare Workers + Payload CMS + Claude 3.5

Data sources: OpenAI, Google AI, DeepMind, AWS ML Blog, HuggingFace, etc.