Show HN: ClawRouter – Open-source LLM router that saves 78% on inference costs (github.com)
11 points by vickyfu 1 day ago | 1 comment
vickyfu 1 day ago
Hey HN, I built ClawRouter because I was spending $200+/month on LLM API calls and realized most of my requests were simple enough for cheap models.

ClawRouter sits between your app and 30+ LLM providers (OpenAI, Anthropic, Google, DeepSeek, xAI). For each request, it classifies the query complexity and routes to the cheapest model that can handle it.
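To give a feel for the drop-in pattern, here's a rough sketch using an OpenAI-style client pointed at a local endpoint (sketch only: the URL, port, and "auto" model name are placeholders, not the real defaults; the README has the actual setup):

  import OpenAI from "openai";

  // Point an ordinary OpenAI client at the local router instead of api.openai.com.
  // Base URL and model name below are placeholders for illustration.
  const client = new OpenAI({
    baseURL: "http://localhost:8402/v1",
    apiKey: "unused-placeholder", // payment can go over x402 instead of an API key
  });

  async function main() {
    const res = await client.chat.completions.create({
      model: "auto", // let the router pick the tier/provider
      messages: [{ role: "user", content: "Summarize this changelog in three bullets." }],
    });
    console.log(res.choices[0].message.content);
  }

  main();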

How it works (simplified sketch below):

- 14-dimension weighted scoring (code detection, reasoning markers, length, etc.)
- 4 tiers: SIMPLE → DeepSeek ($0.27/M) | MEDIUM → GPT-4o-mini | COMPLEX → Claude Sonnet | REASONING → o3
- Routing runs 100% locally in <1ms; zero external API calls for routing
- Payment via x402 USDC micropayments on Base (no API keys needed)
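Here's the shape of the scoring-and-tier idea in TypeScript. This is a toy version: three example dimensions stand in for the real 14, and the weights and thresholds are made up for illustration, not the actual values.

  type Tier = "SIMPLE" | "MEDIUM" | "COMPLEX" | "REASONING";

  interface Dimension {
    name: string;
    weight: number;
    score: (prompt: string) => number; // normalized 0..1
  }

  // Example dimensions only; names and weights are illustrative.
  const dimensions: Dimension[] = [
    { name: "length", weight: 1.0, score: p => Math.min(p.length / 4000, 1) },
    { name: "code", weight: 2.0, score: p => /\bfunction\b|\bclass\b|=>|;\s*$/m.test(p) ? 1 : 0 },
    { name: "reasoning", weight: 2.5, score: p => /\b(prove|derive|step by step|why)\b/i.test(p) ? 1 : 0 },
  ];

  // Weighted average of all dimension scores: 0 (trivial) .. 1 (hardest).
  function complexity(prompt: string): number {
    const totalWeight = dimensions.reduce((s, d) => s + d.weight, 0);
    const weighted = dimensions.reduce((s, d) => s + d.weight * d.score(prompt), 0);
    return weighted / totalWeight;
  }

  // Thresholds are illustrative only.
  function pickTier(prompt: string): Tier {
    const c = complexity(prompt);
    if (c < 0.25) return "SIMPLE";    // e.g. DeepSeek
    if (c < 0.50) return "MEDIUM";    // e.g. GPT-4o-mini
    if (c < 0.75) return "COMPLEX";   // e.g. Claude Sonnet
    return "REASONING";               // e.g. o3
  }

  console.log(pickTier("Reformat this list as JSON."));                          // SIMPLE
  console.log(pickTier("Prove step by step why this algorithm is O(n log n).")); // MEDIUM with these toy weights

Because everything is a handful of regexes and arithmetic over the prompt itself, the decision stays local and well under a millisecond.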

The cost savings come from the fact that ~60% of typical LLM traffic is simple Q&A, summarization, or formatting that doesn't need a frontier model.
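For a rough sense of where a number like that comes from, here's a back-of-envelope calculation; the per-token prices and traffic mix below are made up for illustration, not measurements.

  // Blended cost per 1M tokens under an assumed traffic mix vs. sending
  // everything to one frontier model. All numbers are illustrative.
  const pricePerM = { frontier: 15, simple: 0.27, medium: 0.6, complex: 15, reasoning: 40 };
  const mix = { simple: 0.6, medium: 0.2, complex: 0.15, reasoning: 0.05 };

  const baseline = pricePerM.frontier;
  const routed =
    mix.simple * pricePerM.simple +
    mix.medium * pricePerM.medium +
    mix.complex * pricePerM.complex +
    mix.reasoning * pricePerM.reasoning;

  console.log(`blended: $${routed.toFixed(2)}/M vs $${baseline.toFixed(2)}/M baseline`);
  console.log(`savings: ${((1 - routed / baseline) * 100).toFixed(0)}%`); // ~70% with these toy numbers

With these toy numbers the blend comes out roughly 70% cheaper; the actual figure depends on your traffic mix and which models end up in each tier.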

Built as an OpenClaw plugin, but works standalone too. MIT licensed.

npm: @blockrun/clawrouter

Happy to answer questions about the routing algorithm or the x402 payment approach.