AI Providers
OpenAlice supports three AI backends managed through a profile system: you define one or more named profiles in ai-provider-manager.json, and switch the active one at runtime without restarting. All backends share the same tool set and interface.
Claude Agent SDK
The default backend (backend: "agent-sdk"). Uses @anthropic-ai/claude-agent-sdk to call Claude.
Authentication methods:
- claudeai (default) — Uses your Claude Pro or Max subscription via the Claude Code CLI. No API key needed — just have Claude Code installed and authenticated.
- api-key — Direct Anthropic API key stored on the profile.
How tools work: An in-process MCP server is created from ToolCenter's registered tools. The Agent SDK consumes tools through this MCP server — the same protocol Claude Code uses.
Session handling: History is serialized as text (not structured messages). The AgentCenter builds a text prompt from session entries and sends it as a single string.
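As a rough illustration of text-based history, the sketch below flattens session entries into one prompt string. The names (`SessionEntry`, `buildTextPrompt`) and the exact line format are illustrative assumptions, not the actual AgentCenter API.

```typescript
// Illustrative only: not the real AgentCenter types or prompt format.
interface SessionEntry {
  role: "user" | "assistant";
  text: string;
}

// Flatten prior session entries plus the new user message into a single
// text prompt, the way a text-serializing backend might.
function buildTextPrompt(entries: SessionEntry[], next: string): string {
  const history = entries
    .map((e) => `${e.role === "user" ? "User" : "Assistant"}: ${e.text}`)
    .join("\n");
  return history ? `${history}\nUser: ${next}` : `User: ${next}`;
}
```

The trade-off versus structured messages: simple to build and debug, but tool calls and results are reduced to plain text.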
Vercel AI SDK
The multi-provider backend (backend: "vercel-ai-sdk"). Makes direct API calls using the Vercel AI SDK.
Supported model providers:
- Anthropic — Claude models (requires API key)
- OpenAI — GPT models (requires API key)
- Google — Gemini models (requires API key)
How tools work: Tools from ToolCenter are exported in Vercel AI SDK format. The provider runs an in-process tool loop.
Session handling: History is serialized as structured ModelMessage[] arrays (role-based messages with content blocks). This preserves tool_use and tool_result blocks natively.
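A hedged sketch of what role-based messages with content blocks look like. The type names below are illustrative (the real Vercel AI SDK types differ in detail); the point is that tool_use and tool_result survive as typed blocks instead of being flattened to text.

```typescript
// Illustrative shapes only, not the actual Vercel AI SDK type definitions.
type ContentBlock =
  | { type: "text"; text: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }
  | { type: "tool_result"; toolUseId: string; output: string };

interface ModelMessage {
  role: "user" | "assistant" | "tool";
  content: ContentBlock[];
}

// Because blocks keep their types, tool activity in a history is
// recoverable by inspection rather than by parsing prose.
function countToolCalls(history: ModelMessage[]): number {
  return history
    .flatMap((m) => m.content)
    .filter((b) => b.type === "tool_use").length;
}
```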
Codex
The OpenAI Codex backend (backend: "codex"). Useful for GPT-family models when you prefer Codex's session model over the Vercel AI SDK.
Authentication methods:
- codex-oauth (default) — OAuth login flow
- api-key — Direct OpenAI API key stored on the profile
Configuration
The AI provider is configured in data/config/ai-provider-manager.json as a set of named profiles plus one activeProfile pointer:
```json
{
  "profiles": {
    "default": {
      "backend": "agent-sdk",
      "model": "claude-sonnet-4-6",
      "loginMethod": "claudeai"
    },
    "research-gpt": {
      "backend": "vercel-ai-sdk",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "sk-..."
    }
  },
  "activeProfile": "default"
}
```
Each profile's shape depends on its backend:
| Backend | Required | Optional |
|---|---|---|
| agent-sdk | model, loginMethod (claudeai / api-key) | apiKey, baseUrl |
| vercel-ai-sdk | provider (anthropic / openai / google), model | apiKey, baseUrl |
| codex | model, loginMethod (codex-oauth / api-key) | apiKey, baseUrl |
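The table above maps naturally onto a discriminated union keyed on backend. The field names follow the config example; the exact runtime types are an assumption for illustration.

```typescript
// Profile shapes per the table above (field names from the config example;
// actual internal types are assumed, not confirmed).
type Profile =
  | { backend: "agent-sdk"; model: string; loginMethod: "claudeai" | "api-key";
      apiKey?: string; baseUrl?: string }
  | { backend: "vercel-ai-sdk"; provider: "anthropic" | "openai" | "google";
      model: string; apiKey?: string; baseUrl?: string }
  | { backend: "codex"; model: string; loginMethod: "codex-oauth" | "api-key";
      apiKey?: string; baseUrl?: string };

// Minimal structural check of the required fields for each backend.
function isValidProfile(p: any): p is Profile {
  if (!p || typeof p.model !== "string") return false;
  switch (p.backend) {
    case "agent-sdk":
      return p.loginMethod === "claudeai" || p.loginMethod === "api-key";
    case "vercel-ai-sdk":
      return ["anthropic", "openai", "google"].includes(p.provider);
    case "codex":
      return p.loginMethod === "codex-oauth" || p.loginMethod === "api-key";
    default:
      return false;
  }
}
```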
Hot-reload — The GenerateRouter re-reads this file on every call. Switch activeProfile and the very next message uses the new profile.
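Conceptually, hot-reload just means the router re-parses the file and resolves activeProfile on each call, so no state survives between reads. A minimal sketch (the function name is illustrative; the real GenerateRouter reads data/config/ai-provider-manager.json with the filesystem API each call):

```typescript
// Illustrative: resolve the active profile from a freshly parsed config.
// In the real flow this config object is re-read from disk per call.
interface ProviderConfig {
  profiles: Record<string, { backend: string; model: string }>;
  activeProfile: string;
}

function resolveActiveProfile(cfg: ProviderConfig) {
  const profile = cfg.profiles[cfg.activeProfile];
  if (!profile) throw new Error(`Unknown activeProfile: ${cfg.activeProfile}`);
  return profile;
}
```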
Migration — If you have a legacy flat ai-provider.json (or the older pair model.json + api-keys.json), OpenAlice migrates it into the profile format on first startup and deletes the legacy files.
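The migration is essentially a wrap: the legacy flat object becomes a single profile named default, and activeProfile points at it. The legacy field names below are assumed from the profile examples, not confirmed against the old format.

```typescript
// Hedged sketch of the legacy migration. LegacyConfig's fields are an
// assumption based on the profile examples in this document.
interface LegacyConfig {
  backend: string;
  model: string;
  loginMethod?: string;
  apiKey?: string;
}

// Wrap a flat legacy config into the profile format under "default".
function migrateLegacy(legacy: LegacyConfig) {
  return {
    profiles: { default: { ...legacy } },
    activeProfile: "default",
  };
}
```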
Presets
The Web UI has a preset catalog that generates profile forms from JSON schemas — no manual JSON editing required. Each preset declares its backend, available models, endpoints, and which fields to mask as passwords.
| Preset | Backend | Notes |
|---|---|---|
| Claude (Subscription) | agent-sdk / claudeai | Uses your Claude Pro/Max via Claude Code CLI. Default. |
| Claude (API Key) | agent-sdk / api-key | Direct Anthropic API key. |
| Codex (Subscription) | codex / codex-oauth | OpenAI via ChatGPT subscription OAuth. |
| Codex (API Key) | codex / api-key | Direct OpenAI API key. |
| Google Gemini | vercel-ai-sdk / google | Gemini 2.5 Pro / Flash. |
| MiniMax | agent-sdk | Via Anthropic-compatible endpoint. Region-aware (China vs International). |
| GLM (Zhipu) | agent-sdk | Anthropic-compatible. Region-aware. |
| Kimi (Moonshot) | agent-sdk | Via Moonshot's Anthropic-compatible endpoint. Region-aware. |
| DeepSeek | agent-sdk | Via Anthropic-compatible endpoint. V4 Pro is the flagship; V4 Flash is the cheap/fast option. Single platform — no regional split. |
| Custom | any | Full control — pick backend, provider, model, endpoint manually. |
Region-aware endpoints. Chinese AI vendors with split deployments — MiniMax, GLM, and Kimi — run separate China and International consoles with different endpoints and region-locked API keys. The preset's endpoint picker only offers matched region/endpoint pairs, so you can't paste a China key into an International endpoint or vice versa. DeepSeek is not region-split (single platform at platform.deepseek.com), so the picker shows just the one endpoint.
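The gating logic reduces to a vendor-to-regions map. The endpoint URLs below are placeholders, not the vendors' real base URLs; the shape is what matters: split vendors expose two regions, DeepSeek exposes one.

```typescript
// Placeholder URLs only — illustrating the region map, not real endpoints.
const endpoints: Record<string, Record<string, string>> = {
  minimax:  { china: "https://example.cn/v1", international: "https://example.com/v1" },
  glm:      { china: "https://example.cn/v1", international: "https://example.com/v1" },
  kimi:     { china: "https://example.cn/v1", international: "https://example.com/v1" },
  deepseek: { default: "https://example.com/v1" }, // single platform, no split
};

// The picker offers only the regions a vendor actually has.
function regionsFor(vendor: string): string[] {
  return Object.keys(endpoints[vendor] ?? {});
}
```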
Connection test before save. The profile create modal runs a live connection test against the chosen endpoint + model + key before committing the profile to disk. This catches bad keys, wrong regions, and model-name typos at creation time.
Inline model switcher. Once a profile exists, you can switch its model from the profile list without opening the full form — useful for flipping between Opus and Sonnet on the same Claude profile.
Runtime Switching
You can switch profiles in several ways:
- Web UI — Use the provider selector in the settings panel
- Edit config — Change activeProfile in ai-provider-manager.json directly
- Ask Alice — "Switch to the research-gpt profile" (if evolution mode is on, Alice can edit her own config)
The switch is immediate. No restart, no session loss.
Per-Channel Overrides
Web UI sub-channels can point at a different profile so you can run different models in different chat tabs:
```json
[
  {
    "id": "research",
    "label": "Research (GPT-4o)",
    "profile": "research-gpt",
    "systemPrompt": "You are a research assistant..."
  }
]
```
Configure in data/config/web-subchannels.json. Each sub-channel can also override the system prompt and disable specific tools. If profile is omitted, the channel falls back to the global activeProfile.
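The fallback rule is a one-liner: the sub-channel's profile wins if set, otherwise the global activeProfile applies. Types and names below are illustrative.

```typescript
// Illustrative sub-channel shape; only the fields relevant to resolution.
interface SubChannel {
  id: string;
  profile?: string;
}

// Per-channel override with fallback to the global active profile.
function profileForChannel(channel: SubChannel, activeProfile: string): string {
  return channel.profile ?? activeProfile;
}
```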
Choosing a Backend
| | Agent SDK | Vercel AI SDK | Codex |
|---|---|---|---|
| Best for | Claude Pro/Max subscribers | API key users, multi-model | Codex/GPT users on OAuth |
| Auth | Claude Code login or API key | API keys per provider | Codex OAuth or API key |
| Models | Claude only | Anthropic, OpenAI, Google | OpenAI (GPT / Codex) |
| Tool delivery | In-process MCP server | Vercel native tools | In-process |
| Session format | Text history | Structured messages | Structured messages |
For most users, the default Agent SDK with Claude Code login is the simplest setup — no API keys, no configuration. Switch backends when you need a different model family.