# Agent Settings

The `agents.defaults` section controls the core agent behavior.
```json
{
  "agents": {
    "defaults": {
      "workspace": "~/.kyber/workspace",
      "provider": "openrouter",
      "maxTokens": 8192,
      "temperature": 0.7,
      "timezone": ""
    }
  }
}
```

## Fields
| Field | Type | Default | Description |
|---|---|---|---|
| `workspace` | string | `~/.kyber/workspace` | Path to the agent's workspace directory |
| `provider` | string | `openrouter` | Default provider (see Providers) |
| `maxTokens` | number | `8192` | Maximum number of tokens in the model's response |
| `temperature` | number | `0.7` | Sampling temperature (`0.0` = deterministic, `1.0` = creative) |
| `timezone` | string | `""` | User timezone (e.g. `America/New_York`); empty means system local time |
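For example, an agent pinned to a fixed timezone with near-deterministic sampling might look like this (the specific values are illustrative, not recommendations):

```json
{
  "agents": {
    "defaults": {
      "workspace": "~/.kyber/workspace",
      "provider": "openrouter",
      "maxTokens": 4096,
      "temperature": 0.2,
      "timezone": "America/New_York"
    }
  }
}
```

Fields left at their defaults can be omitted; they are shown here only for completeness.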
## Model selection

Models are configured per provider in `~/.kyber/config.json`:
```json
{
  "providers": {
    "openrouter": {
      "model": "anthropic/claude-sonnet-4-20250514"
    }
  }
}
```

In the dashboard, each provider card shows a dropdown that fetches available models directly from the provider's API once you have entered your API key.
Examples:

| Provider | Example model |
|---|---|
| OpenRouter | `anthropic/claude-sonnet-4-20250514`, `google/gemini-3-flash-preview` |
| Anthropic | `claude-sonnet-4-20250514` |
| OpenAI | `gpt-4o` |
| Gemini | `gemini-2.5-flash` |
| DeepSeek | `deepseek-chat` |
| Groq | `llama-3.3-70b-versatile` |
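Putting the two sections together, switching the default provider to Anthropic might look like the sketch below (it assumes the `anthropic` provider block takes the same `model` key as the `openrouter` example above):

```json
{
  "agents": {
    "defaults": {
      "provider": "anthropic"
    }
  },
  "providers": {
    "anthropic": {
      "model": "claude-sonnet-4-20250514"
    }
  }
}
```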
## Custom providers

Any OpenAI-compatible endpoint can be added as a custom provider:
```json
{
  "providers": {
    "custom": [
      {
        "name": "my-local",
        "apiBase": "http://localhost:11434/v1",
        "apiKey": "",
        "model": "llama3"
      }
    ]
  }
}
```

The API key for a custom provider can also be set in `.env`:
```
# For the custom provider named "my-local"
KYBER_CUSTOM_PROVIDER_MY_LOCAL_API_KEY=your-key
```
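Judging from the example above, the variable name appears to be derived from the provider's `name` by uppercasing it and replacing hyphens with underscores. Under that assumption, a hypothetical second provider named `gpu-box` (the name is for illustration only) would read:

```
# Hypothetical custom provider named "gpu-box"
KYBER_CUSTOM_PROVIDER_GPU_BOX_API_KEY=your-key
```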