Why You Need This
Factory Droid is a fast, agent-native coding tool that embeds AI into your IDE, terminal, CLI, Slack, and project manager. But it expects API keys for Anthropic and OpenAI — while Claude Max and OpenAI Codex subscriptions use OAuth tokens.
CLIProxyAPI bridges this gap. It runs a local proxy that accepts API key requests from Factory Droid and converts them to OAuth-authenticated requests. One proxy handles both Claude and Codex simultaneously.
Switch models in Factory with `/model` — no restart needed.
Install CLIProxyAPI
On macOS, it’s a two-liner:
```bash
brew install cliproxyapi
brew services start cliproxyapi
```

That spins up a local proxy on `http://localhost:8317` with automatic token rotation and credential pooling.
The default config lives at `$(brew --prefix)/etc/cliproxyapi.conf` (typically `/opt/homebrew/etc/cliproxyapi.conf` on Apple Silicon).
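For orientation, the parts of that config this guide touches look roughly like this (an excerpt sketched from the shipped defaults, not the full file — field names are as the default config uses them):

```yaml
# Excerpt from cliproxyapi.conf (assumed defaults)
port: 8317
api-keys:
  - "your-api-key-1"
  - "your-api-key-2"
  - "your-api-key-3"
```

Any key in `api-keys` is accepted by the proxy; Factory will present one of these on every request.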
Authenticate with Claude and Codex
Log in to each provider:
```bash
cliproxyapi login claude
cliproxyapi login codex
```

Each command opens your browser for OAuth. Tokens are saved to `~/.cli-proxy-api/` and the proxy picks them up automatically.
Check Available Models
Before configuring Factory, see what your proxy actually serves. What you see depends on which providers you’ve logged into:
```bash
curl -s -H "Authorization: Bearer your-api-key-1" \
  http://localhost:8317/v1/models | python3 -c \
  "import sys,json; data=json.load(sys.stdin); \
  [print(m['id']) for m in sorted(data['data'], key=lambda x: x['id'])]"
```

Only add models that appear in this list to your Factory config.
Configure Factory Droid
Factory reads custom models from `~/.factory/settings.json` under a `customModels` array. Create or edit this file:
Use `~/.factory/settings.json` with camelCase fields (`customModels`, `baseUrl`, `apiKey`). The legacy `~/.factory/config.json` with snake_case fields (`custom_models`, `base_url`, `api_key`) still works but takes lower priority and doesn’t support environment variable expansion. Remove `config.json` if you have both.
Minimal Configuration
```json
{
  "customModels": [
    {
      "model": "claude-opus-4-7",
      "displayName": "Claude Opus 4.7",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "gpt-5.5",
      "displayName": "GPT-5.5",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "openai"
    }
  ]
}
```

The `apiKey` must match one of the keys in your proxy’s `api-keys` list in `cliproxyapi.conf`. The default config ships with `your-api-key-1`, `your-api-key-2`, and `your-api-key-3`. You can also use environment variable syntax: `"apiKey": "${PROVIDER_API_KEY}"`.
Full Configuration with Latest Models
Here’s a complete settings.json with all current Claude and OpenAI models:
```json
{
  "customModels": [
    {
      "model": "claude-opus-4-7",
      "displayName": "Claude Opus 4.7",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "claude-sonnet-4-6",
      "displayName": "Claude Sonnet 4.6",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "claude-opus-4-6",
      "displayName": "Claude Opus 4.6",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "claude-haiku-4-5-20251001",
      "displayName": "Claude Haiku 4.5",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "claude-sonnet-4-5-20250929",
      "displayName": "Claude Sonnet 4.5",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "claude-opus-4-5-20251101",
      "displayName": "Claude Opus 4.5",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "anthropic"
    },
    {
      "model": "gpt-5.5",
      "displayName": "GPT-5.5",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "openai"
    },
    {
      "model": "gpt-5.4",
      "displayName": "GPT-5.4",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "openai"
    },
    {
      "model": "gpt-5.4-mini",
      "displayName": "GPT-5.4 Mini",
      "baseUrl": "http://localhost:8317/v1",
      "apiKey": "your-api-key-1",
      "provider": "openai"
    }
  ]
}
```

Provider Types
The provider field determines which API format Factory uses to talk to the proxy:
| Provider | API Format | Use For |
|---|---|---|
| `anthropic` | Anthropic Messages API (`/v1/messages`) | Claude models |
| `openai` | OpenAI Responses API (`/v1/responses`) | GPT models |
| `generic-chat-completion-api` | OpenAI Chat Completions (`/v1/chat/completions`) | OpenRouter, Ollama, vLLM, etc. |
CLIProxyAPI supports all three formats and routes based on the model name regardless of which provider type Factory uses.
Using Your Models
Factory detects settings.json changes automatically via file watching. Switch models with:
```
/model
```

Your custom models appear in a separate “Custom models” section. Pick one and start coding.
To verify the proxy is responding correctly:
```bash
# Claude
curl -s -H "x-api-key: your-api-key-1" \
  http://localhost:8317/v1/messages \
  -H "Content-Type: application/json" \
  -d '{"model":"claude-opus-4-7","messages":[{"role":"user","content":"hi"}],"max_tokens":10}'

# OpenAI
curl -s -H "Authorization: Bearer your-api-key-1" \
  http://localhost:8317/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-5.5","messages":[{"role":"user","content":"hi"}],"max_tokens":10}'
```

Updating to New Models
When new Claude or OpenAI models launch, you don’t need to rebuild or reinstall anything:
- Make sure CLIProxyAPI is up to date: `brew upgrade cliproxyapi`
- Restart the proxy: `brew services restart cliproxyapi`
- Check available models with the `/v1/models` endpoint
- Add the new model to your `settings.json`
Factory picks up config changes automatically — no restart needed on the Droid side either.
Troubleshooting
“BYOK Error: 404”
Check these in order:
- Proxy running? — `brew services list | grep cliproxyapi`
- API key matches? — The `apiKey` in `settings.json` must match an entry in your proxy’s `api-keys` list in `cliproxyapi.conf`
- baseUrl has `/v1`? — Must be `http://localhost:8317/v1`, not `http://localhost:8317`
- Model available? — Query `/v1/models` to verify the model exists on your proxy
- Correct config file? — Use `settings.json` (camelCase), not the legacy `config.json` (snake_case)
- Correct provider? — `anthropic` for Claude, `openai` for GPT
OAuth token expired
Re-authenticate:
```bash
cliproxyapi login claude
cliproxyapi login codex
```

Tokens are saved to `~/.cli-proxy-api/` and picked up automatically.
Model not appearing in Factory selector
Check JSON syntax in settings.json. Factory watches this file and should detect changes within seconds. Verify all required fields are present: model, baseUrl, apiKey, and provider.
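A quick way to catch both problems at once (invalid JSON and missing fields) is a short script like this. It is a hypothetical checker, not part of Factory; the required field names are the ones listed above:

```python
import json
from pathlib import Path

# Required fields for each entry in the customModels array
REQUIRED = ("model", "baseUrl", "apiKey", "provider")

def missing_fields(entry: dict) -> list[str]:
    """Return which required customModels fields are absent from one entry."""
    return [f for f in REQUIRED if f not in entry]

def check_settings(path: Path = Path.home() / ".factory" / "settings.json") -> list[str]:
    """Report incomplete entries; json.loads raises on invalid JSON syntax."""
    settings = json.loads(path.read_text())
    problems = []
    for entry in settings.get("customModels", []):
        gaps = missing_fields(entry)
        if gaps:
            name = entry.get("model", "<unnamed>")
            problems.append(f"{name}: missing {', '.join(gaps)}")
    return problems
```

An empty list from `check_settings()` means every entry has the fields Factory needs; a `JSONDecodeError` points you at the syntax problem.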
“Invalid API key” from proxy
The proxy validates every request against its api-keys list. Make sure the apiKey in your Factory config matches exactly. You can edit the list in cliproxyapi.conf if needed.