Providers

Kyber supports multiple LLM providers through direct API integration. You can pin a specific provider so it won’t fall back to another key when multiple are configured.

Supported providers

| Provider | Config key | Env var for API key | Get a key |
| --- | --- | --- | --- |
| OpenRouter | openrouter | KYBER_PROVIDERS__OPENROUTER__API_KEY | openrouter.ai/keys |
| Anthropic | anthropic | KYBER_PROVIDERS__ANTHROPIC__API_KEY | console.anthropic.com |
| OpenAI | openai | KYBER_PROVIDERS__OPENAI__API_KEY | platform.openai.com |
| Google Gemini | gemini | KYBER_PROVIDERS__GEMINI__API_KEY | aistudio.google.com |
| DeepSeek | deepseek | KYBER_PROVIDERS__DEEPSEEK__API_KEY | platform.deepseek.com |
| Groq | groq | KYBER_PROVIDERS__GROQ__API_KEY | console.groq.com |

Any OpenAI-compatible endpoint (Ollama, vLLM, Together, etc.) can be added as a custom provider.

Configuration

API keys go in ~/.kyber/.env. Provider settings (model, base URL) go in ~/.kyber/config.json:

# ~/.kyber/.env
KYBER_PROVIDERS__OPENROUTER__API_KEY=sk-or-v1-your-key-here

{
  "agents": {
    "defaults": {
      "provider": "openrouter"
    }
  },
  "providers": {
    "openrouter": {
      "chatModel": "google/gemini-3-flash-preview",
      "taskModel": "google/gemini-3-flash-preview"
    }
  }
}

Chat vs task models

Each provider supports separate models for different roles:

| Field | Purpose |
| --- | --- |
| chatModel | Used for conversational replies |
| taskModel | Used for background workers/tasks |
| model | Legacy fallback when chatModel/taskModel aren't set |
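For example, a provider entry can set both roles explicitly while keeping model as a legacy catch-all (the model IDs here are placeholders, not recommendations):

```json
{
  "providers": {
    "openrouter": {
      "chatModel": "anthropic/claude-sonnet-4-20250514",
      "taskModel": "google/gemini-3-flash-preview",
      "model": "google/gemini-3-flash-preview"
    }
  }
}
```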

You can also use entirely different providers for chat and tasks:

{
  "agents": {
    "defaults": {
      "chatProvider": "openrouter",
      "taskProvider": "deepseek"
    }
  }
}

If chatProvider or taskProvider is unset, that role falls back to the main provider setting.
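For instance, setting only taskProvider routes background tasks to DeepSeek while chat keeps using the main provider:

```json
{
  "agents": {
    "defaults": {
      "provider": "openrouter",
      "taskProvider": "deepseek"
    }
  }
}
```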

Using OpenRouter

OpenRouter gives you access to models from every major provider through a single API key. This is the default and easiest option.

# ~/.kyber/.env
KYBER_PROVIDERS__OPENROUTER__API_KEY=sk-or-v1-xxx

{
  "agents": {
    "defaults": {
      "provider": "openrouter"
    }
  },
  "providers": {
    "openrouter": {
      "chatModel": "anthropic/claude-sonnet-4-20250514"
    }
  }
}

You can use any model available on OpenRouter by setting chatModel or taskModel to the OpenRouter model ID.

Using a provider directly

To use a provider’s API directly (lower latency, no middleman):

# ~/.kyber/.env
KYBER_PROVIDERS__DEEPSEEK__API_KEY=sk-xxx

{
  "agents": {
    "defaults": {
      "provider": "deepseek"
    }
  },
  "providers": {
    "deepseek": {
      "chatModel": "deepseek-chat"
    }
  }
}

Custom providers (Ollama, vLLM, etc.)

Any OpenAI-compatible endpoint can be added as a custom provider. This works for Ollama, vLLM, Together, or any self-hosted model server:

{
  "agents": {
    "defaults": {
      "provider": "my-local"
    }
  },
  "providers": {
    "custom": [
      {
        "name": "my-local",
        "apiBase": "http://localhost:11434/v1",
        "chatModel": "llama3"
      }
    ]
  }
}

For custom providers that need an API key, add it to .env:

KYBER_CUSTOM_PROVIDER_MY_LOCAL_API_KEY=your-key
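The variable name appears to be derived from the provider's name field, uppercased with dashes replaced by underscores. That mapping is an inference from the example above rather than a documented rule; assuming it holds, a custom provider named vllm-lab would use:

```
# ~/.kyber/.env
KYBER_CUSTOM_PROVIDER_VLLM_LAB_API_KEY=your-key
```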

Custom providers can also be added directly from the dashboard UI.

Provider fallback

When provider is set explicitly, Kyber only uses that provider’s API key. If you leave provider empty, Kyber checks keys in this order: OpenRouter → DeepSeek → Anthropic → OpenAI → Gemini → Groq, and uses the first one it finds.

Retries

Each provider client handles automatic retries with exponential backoff for transient errors such as rate limits, timeouts, and malformed upstream responses.
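The retry details are internal to Kyber, but the general pattern looks roughly like this sketch (the attempt counts and delays are assumptions, not Kyber's actual values):

```python
import random
import time

class TransientError(Exception):
    """Rate limit, timeout, or malformed upstream response."""

def call_with_retries(fn, max_attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry fn on transient errors with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the caller
            # Exponential backoff: base, 2x, 4x, ... capped at max_delay,
            # with random jitter to avoid synchronized retry storms.
            delay = min(base_delay * 2 ** attempt, max_delay)
            time.sleep(delay * random.uniform(0.5, 1.5))
```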