Tutorial · 7 min

Smart LLM Routing: Using OpenRouter with Mithril for Cost-Optimized AI

How to build agents that automatically pick the cheapest or fastest LLM for each task, paid per-call through x402.

Not every agent task needs the most expensive LLM. A simple classification might use Haiku, while a complex analysis needs Opus. OpenRouter + Mithril lets your agent make this choice dynamically — and pay per-call for only what it uses.

Why LLM Routing Matters

LLM per-call costs span nearly two orders of magnitude:

  • Claude Haiku: ~$0.001 per call
  • Claude Sonnet: ~$0.01 per call
  • Claude Opus: ~$0.05 per call
  • GPT-4o: ~$0.02 per call

An agent that uses Opus for everything costs roughly 50x more than one that routes intelligently.
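
To make the spread concrete, here is a back-of-envelope sketch using the approximate prices above (illustrative numbers, not live pricing):

```typescript
// Approximate per-call prices from the list above (USD; illustrative only)
const pricePerCall: Record<string, number> = {
  haiku: 0.001,
  sonnet: 0.01,
  opus: 0.05,
  "gpt-4o": 0.02,
}

// Over 1,000 calls, always-Opus vs. always-Haiku:
const calls = 1000
const opusTotal = calls * pricePerCall.opus   // ~$50
const haikuTotal = calls * pricePerCall.haiku // ~$1
const ratio = opusTotal / haikuTotal          // ~50x
```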

OpenRouter: The Universal LLM API

OpenRouter provides a single API endpoint that routes to any LLM. Instead of integrating with Anthropic, OpenAI, Google, and Meta separately, you call OpenRouter and specify the model.

With x402, your agent doesn't even need an OpenRouter API key. It just pays per-call.

Routing Strategy

    function selectModel(task: string): string {
      // Simple classification/extraction
      if (task === "classify" || task === "extract")
        return "anthropic/claude-haiku-4-5"
    
      // Standard analysis/writing
      if (task === "analyze" || task === "write")
        return "anthropic/claude-sonnet-4-6"
    
      // Complex reasoning/planning
      if (task === "plan" || task === "reason")
        return "anthropic/claude-opus-4-6"
    
      // Default to cost-effective
      return "anthropic/claude-sonnet-4-6"
    }
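
The same routing rules can also be written as a plain lookup table, which is easier to extend as new task types appear. A sketch equivalent to the function above (model IDs copied from it):

```typescript
// Task-to-model lookup equivalent to the if-chain above
const MODEL_BY_TASK: Record<string, string> = {
  classify: "anthropic/claude-haiku-4-5",
  extract: "anthropic/claude-haiku-4-5",
  analyze: "anthropic/claude-sonnet-4-6",
  write: "anthropic/claude-sonnet-4-6",
  plan: "anthropic/claude-opus-4-6",
  reason: "anthropic/claude-opus-4-6",
}

// Unknown tasks fall back to the cost-effective default
const DEFAULT_MODEL = "anthropic/claude-sonnet-4-6"

function selectModelFromTable(task: string): string {
  return MODEL_BY_TASK[task] ?? DEFAULT_MODEL
}
```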

Implementation

    // `m` is an initialized Mithril client; m.pay() settles the x402
    // payment for each call, so no OpenRouter API key is needed.
    async function callLLM(task: string, prompt: string) {
      const model = selectModel(task)

      const result = await m.pay({
        url: "https://openrouter.ai/api/v1/chat/completions",
        method: "POST",
        body: JSON.stringify({
          model,
          messages: [{ role: "user", content: prompt }],
        }),
      })

      return result.data.choices[0].message.content
    }
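
Paid calls can fail transiently like any other HTTP request, so a small retry wrapper is worth having. A minimal, API-agnostic sketch (the helper name and backoff numbers are my own, not part of Mithril):

```typescript
// Hypothetical helper (not part of Mithril): retries a failing async
// call with exponential backoff before giving up.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err
      if (i < attempts - 1) {
        // Wait 100ms, 200ms, 400ms, ... between attempts
        await new Promise((resolve) => setTimeout(resolve, 100 * 2 ** i))
      }
    }
  }
  throw lastError
}
```

Usage would look like `withRetry(() => callLLM("analyze", prompt))`.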

Cost Impact

A research agent that makes 100 LLM calls/day:

| Strategy      | Daily Cost |
|---------------|------------|
| Always Opus   | $5.00      |
| Smart routing | $0.40      |

Smart routing saves 60-92% on LLM costs by matching model capability to task complexity.
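
The routed figure depends entirely on the task mix. A sketch with a hypothetical 80/15/5 split of simple/medium/complex calls, using the approximate prices from earlier:

```typescript
// Illustrative prices (USD per call); the mix below is a hypothetical split
const PRICE = { haiku: 0.001, sonnet: 0.01, opus: 0.05 }

function dailyCost(mix: { haiku: number; sonnet: number; opus: number }): number {
  return (
    mix.haiku * PRICE.haiku +
    mix.sonnet * PRICE.sonnet +
    mix.opus * PRICE.opus
  )
}

const alwaysOpus = dailyCost({ haiku: 0, sonnet: 0, opus: 100 }) // $5.00
const routed = dailyCost({ haiku: 80, sonnet: 15, opus: 5 })     // ~$0.48
const savings = 1 - routed / alwaysOpus                          // ~90%
```

Heavier reliance on Sonnet or Opus pushes savings toward the low end of the 60-92% range; a mostly-Haiku workload pushes it toward the high end.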

Advanced: Dynamic Model Selection

Let the agent itself decide which model to use:

    async function smartCall(prompt: string) {
      // Use a cheap model to classify the task first
      const classification = await callLLM(
        "classify",
        `Classify this task as simple/medium/complex: ${prompt}`
      )

      // Map the verdict onto a task type selectModel understands,
      // so the follow-up call actually uses the chosen tier
      const task = classification.includes("complex")
        ? "reason"   // routes to Opus
        : classification.includes("medium")
        ? "analyze"  // routes to Sonnet
        : "classify" // routes to Haiku

      return callLLM(task, prompt)
    }

The classification call costs ~$0.001 and can save $0.05+ on every subsequent call.

Monitoring

Track model usage in the Mithril dashboard:

  • Cost per model
  • Calls per model
  • Average cost per agent task
  • Model selection distribution
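
The dashboard exposes these metrics for you, but the same counts are easy to keep in-process as well. An illustrative local tracker (names are my own):

```typescript
// Illustrative in-process tracker mirroring the dashboard metrics above
type UsageStats = { calls: number; cost: number }

const usage = new Map<string, UsageStats>()

function recordCall(model: string, cost: number) {
  const stats = usage.get(model) ?? { calls: 0, cost: 0 }
  stats.calls += 1
  stats.cost += cost
  usage.set(model, stats)
}

function report() {
  for (const [model, { calls, cost }] of usage) {
    console.log(
      `${model}: ${calls} calls, $${cost.toFixed(2)} total, ` +
      `$${(cost / calls).toFixed(4)} avg/call`
    )
  }
}
```

Calling `recordCall` next to each `m.pay()` invocation gives a running cost-per-model breakdown that can be cross-checked against the dashboard.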