Every model provider is declared under `[providers.models.<name>]` in `~/.zeroclaw/config.toml`. The `<name>` is your own alias: it's how you reference the provider elsewhere in the config (`default_model = "claude"`, `fallback_providers = ["claude", "local"]`, etc.).
```toml
[providers.models.<name>]
kind = "<provider-kind>"   # required: selects the implementation
model = "<model-id>"       # required: passed to the provider
```

Almost every provider also takes:

```toml
api_key = "..."            # or an env-var placeholder
base_url = "https://..."   # for OpenAI-compatible or self-hosted endpoints
```

| `kind` | Implementation | Notes |
|---|---|---|
| `anthropic` | `crates/zeroclaw-providers/src/anthropic.rs` | Accepts OAuth tokens (`sk-ant-oat*`) or API keys |
| `openai` | `openai.rs` | GPT, o-series |
| `ollama` | `ollama.rs` | Native `/api/chat`. Supports structured output via `format` |
| `openai-compatible` | `compatible.rs` | One impl for ~20 providers; set `base_url` and optionally `api_key` |
| `bedrock` | `bedrock.rs` | Uses AWS credentials chain (env, IAM role, profile) |
| `gemini` | `gemini.rs` | |
| `gemini-cli` | `gemini_cli.rs` | Shells out to the `gemini` CLI; no API key needed |
| `azure-openai` | `azure_openai.rs` | Takes `base_url` + `api_version` + `deployment` |
| `copilot` | `copilot.rs` | OAuth flow built in |
| `openrouter` | `openrouter.rs` | Multi-vendor routing layer |
| `claude-code` | `claude_code.rs` | Delegates to a Claude Code session via MCP |
| `telnyx` | `telnyx.rs` | Voice AI via Telnyx |
| `kilocli` | `kilocli.rs` | Local KiloCLI inference |
| `reliable` | `reliable.rs` | Fallback-chain wrapper; see Fallback & routing |
| `router` | `router.rs` | Task-hint router; see Fallback & routing |
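Combining the skeleton above with the table, a config declaring two providers might look like the following sketch (the model IDs and key handling are illustrative; adjust to your account):

```toml
[providers.models.claude]
kind = "anthropic"
model = "claude-haiku-4-5-20251001"
api_key = "..."              # or leave unset and rely on credential resolution

[providers.models.local]
kind = "ollama"
base_url = "http://localhost:11434"
model = "qwen3.6:35b-a3b"
```

The aliases `claude` and `local` are arbitrary; they only matter where you reference them, e.g. in `fallback_providers = ["claude", "local"]`.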
Four ways to supply credentials, in resolution order:
- Inline `api_key = "..."` in the config entry (fine for dev, risky for checked-in configs)
- Config-level secrets store: encrypted at `~/.zeroclaw/secrets` via a local key file
- Provider-specific env var: `ANTHROPIC_API_KEY`, `ANTHROPIC_OAUTH_TOKEN`, `OPENAI_API_KEY`, `OPENROUTER_API_KEY`, `GROQ_API_KEY`, etc.
- Generic fallback: `ZEROCLAW_API_KEY`, `API_KEY`
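Given this resolution order, a config you intend to commit can simply omit `api_key` and let the key resolve from the environment. A sketch of such an entry (assuming the env-var resolution described above):

```toml
[providers.models.gpt]
kind = "openai"
model = "gpt-4o"
# No api_key here: resolved from OPENAI_API_KEY at runtime,
# falling back to ZEROCLAW_API_KEY / API_KEY if set.
```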
The onboarding wizard writes credentials to the secrets store by default. Config files you commit should use neither inline keys nor `env_passthrough` entries that leak user keys.
Several providers support OAuth / subscription-style tokens instead of raw API keys:
- Anthropic: `sk-ant-oat-*` OAuth tokens work anywhere an API key does. No cost if you're on a Pro/Team plan.
- GitHub Copilot: authenticate via the onboarding wizard (`zeroclaw onboard`), which handles the OAuth flow. The token is stored in the secrets backend.
- Gemini CLI: uses the `gemini` CLI's existing auth.
- Claude Code: uses your Claude Code login.
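For example, since Anthropic OAuth tokens work anywhere an API key does, an entry can pass one in the `api_key` field. A sketch (token value elided, model ID illustrative):

```toml
[providers.models.claude]
kind = "anthropic"
model = "claude-haiku-4-5-20251001"
api_key = "sk-ant-oat-..."   # OAuth token used in place of an API key
```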
The onboarding wizard detects Docker/Podman/Kubernetes and rewrites localhost to container-appropriate hostnames:
```toml
[providers.models.local]
kind = "ollama"
base_url = "http://host.docker.internal:11434" # was "http://localhost:11434" on host
```

You can also force this manually at runtime:

```bash
ZEROCLAW_OLLAMA_BASE_URL=http://ollama:11434 zeroclaw agent
```

Beyond the universal fields, some providers accept extras. Highlights:
```toml
[providers.models.local]
kind = "ollama"
base_url = "http://localhost:11434"
model = "qwen3.6:35b-a3b"
think = false             # disable reasoning mode for faster output
reasoning_effort = "none" # same intent, passed as a top-level field
options = { temperature = 0, num_ctx = 32768 }
```

```toml
[providers.models.groq]
kind = "openai-compatible"
base_url = "https://api.groq.com/openai"
model = "llama-3.3-70b-versatile"
api_key = "gsk_..."
# Optional: supplies SSE tool-call streaming hints the endpoint understands
native_tool_streaming = true
```

```toml
[providers.models.azure]
kind = "azure-openai"
base_url = "https://my-resource.openai.azure.com"
deployment = "gpt-4o"
api_version = "2024-10-01-preview"
api_key = "..."
```

```toml
default_provider = "claude"
default_model = "claude-haiku-4-5-20251001"
```

Both are read at agent startup. Channels, tools, and SOPs can override per-request.
- Overview
- Provider catalog: concrete config examples for every provider
- Streaming
- Fallback & routing