# Providers
OpenCode supports 75+ LLM providers through the AI SDK and Models.dev, covering both hosted services and locally run models.
## Setup Process

- Add API keys using the `/connect` command
- Configure the provider in your OpenCode config
- Credentials are stored in `~/.local/share/opencode/auth.json`
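As a sketch, the configuration step above adds a provider block to your OpenCode config. The provider ID `myprovider` here is a placeholder; concrete blocks for Groq, DeepSeek, and others appear later on this page:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "myprovider": {
      "options": {
        "apiKey": "{env:MY_PROVIDER_API_KEY}"
      }
    }
  }
}
```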
## Directory
Here's a quick reference of supported providers:
| Provider | Setup Method | Key Features |
|---|---|---|
| Anthropic | OAuth or API key | Claude Pro/Max support |
| OpenAI | ChatGPT Plus/Pro or API key | GPT-4o, o1 models |
| GitHub Copilot | Device code auth | Pro+ subscription models |
| Google Vertex AI | Service account or gcloud auth | 40+ models |
| Amazon Bedrock | AWS credentials/profile | VPC endpoint support |
| Azure OpenAI | API key + resource name | Custom deployments |
| Groq | API key | High-speed inference |
| DeepSeek | API key | Reasoning models |
| OpenRouter | API key | Multi-provider routing |
| GitLab Duo | API key | GitLab integration |
| Ollama | Local setup | Run models locally |
| LM Studio | Local setup | Local model management |
Additional providers include: 302.AI, Baseten, Cerebras, Cloudflare AI Gateway, Cortecs, Deep Infra, Firmware, Fireworks AI, Hugging Face, Helicone, IO.NET, Moonshot AI, MiniMax, Nebius Token Factory, OVHcloud AI Endpoints, SAP AI Core, Scaleway, Together AI, Venice AI, Vercel AI Gateway, xAI, Z.AI, ZenMux.
## Base URL Configuration

You can customize the base URL for any provider by setting the `baseURL` option. This is useful when routing through a proxy service or a custom endpoint.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://api.anthropic.com/v1"
      }
    }
  }
}
```

## OpenCode Zen
OpenCode Zen is a curated list of models that the OpenCode team has tested and verified to work well.
- Run `/connect` and select opencode
- Visit opencode.ai/auth to authenticate
- Copy and paste your API key
- Use `/models` to view recommended models
## Popular Providers

### Anthropic

- Run `/connect` and select Anthropic
- Choose Claude Pro/Max for browser authentication
- Access models via the `/models` command
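If you authenticate with an API key instead of the browser flow, a provider block like the following should work. This mirrors the pattern used for Groq and DeepSeek below; the `ANTHROPIC_API_KEY` variable name is a convention, not a requirement:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}
```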
### OpenAI

- Create an API key at platform.openai.com/api-keys
- Run `/connect` and search for OpenAI
- Enter your API key
- Select a model with `/models`
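To keep the key out of your config file, you can reference it from the environment, following the same pattern this page uses for Groq and DeepSeek (the `OPENAI_API_KEY` variable name is a convention):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openai": {
      "options": {
        "apiKey": "{env:OPENAI_API_KEY}"
      }
    }
  }
}
```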
### Groq
Groq provides high-speed inference for various models.
- Create an API key at console.groq.com
- Run `/connect` and search for Groq
- Enter your API key
- Select a model with `/models`
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "groq": {
      "options": {
        "apiKey": "{env:GROQ_API_KEY}"
      }
    }
  }
}
```

### DeepSeek
DeepSeek offers powerful reasoning models.
- Create an API key at platform.deepseek.com
- Run `/connect` and search for DeepSeek
- Enter your API key
- Select a model with `/models`
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "deepseek": {
      "options": {
        "apiKey": "{env:DEEPSEEK_API_KEY}"
      }
    }
  }
}
```

### GitHub Copilot
GitHub Copilot integration requires a Pro+ subscription.
- Run `/connect` and select GitHub Copilot
- Complete device code authentication
- Access models via the `/models` command
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "github-copilot": {
      "models": {
        "gpt-4o": {
          "name": "GPT-4o (Copilot)"
        }
      }
    }
  }
}
```

### GitLab Duo
GitLab Duo provides AI features integrated with GitLab.
```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "gitlab-duo": {
      "options": {
        "apiKey": "{env:GITLAB_API_KEY}"
      }
    }
  }
}
```

### OpenRouter

OpenRouter routes requests across many upstream providers. Per-model `provider` options let you pin the routing order and disable fallbacks:
```json
{
  "provider": {
    "openrouter": {
      "models": {
        "moonshotai/kimi-k2": {
          "options": {
            "provider": {
              "order": ["baseten"],
              "allow_fallbacks": false
            }
          }
        }
      }
    }
  }
}
```

### Ollama (Local)

Ollama serves local models through an OpenAI-compatible endpoint:
```json
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "Llama 2"
        }
      }
    }
  }
}
```

### LM Studio (Local)

LM Studio also exposes an OpenAI-compatible local server:
```json
{
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        }
      }
    }
  }
}
```

### Amazon Bedrock
```json
{
  "provider": {
    "amazon-bedrock": {
      "options": {
        "region": "us-east-1",
        "profile": "my-aws-profile"
      }
    }
  }
}
```

#### Authentication Precedence
When using Amazon Bedrock, authentication follows this precedence order:
1. **Bearer Token** - If `AWS_BEARER_TOKEN_BEDROCK` is set (via `/connect` or an environment variable), it takes precedence over all other methods
2. **AWS Credential Chain** - Standard AWS credential resolution:
   - AWS profile configuration
   - Access keys (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
   - IAM roles
   - EKS IRSA (IAM Roles for Service Accounts)
### Azure OpenAI

- Create an Azure OpenAI resource in the Azure portal
- Deploy a model in Azure AI Foundry
- Run `/connect` and search for Azure
- Set the `AZURE_RESOURCE_NAME` environment variable
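If you prefer configuring the key in your config file, a block along these lines may work. Both the provider ID `azure` and the `AZURE_API_KEY` variable name are assumptions here, following the pattern of the other providers on this page:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "azure": {
      "options": {
        "apiKey": "{env:AZURE_API_KEY}"
      }
    }
  }
}
```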
## Custom Provider Setup
For OpenAI-compatible providers:
```json
{
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:MY_API_KEY}"
      },
      "models": {
        "my-model": {
          "name": "My Model",
          "limit": {
            "context": 200000,
            "output": 65536
          }
        }
      }
    }
  }
}
```

## Environment Variable Syntax
Use the `{env:VARIABLE_NAME}` syntax to reference environment variables in your configuration:
```json
{
  "provider": {
    "myprovider": {
      "options": {
        "apiKey": "{env:MY_PROVIDER_API_KEY}"
      }
    }
  }
}
```

This allows you to keep sensitive credentials out of your config files.
## Model Limits
The `limit` fields help OpenCode understand the context window and output limits of your models:
```json
{
  "provider": {
    "myprovider": {
      "models": {
        "my-model": {
          "name": "My Model",
          "limit": {
            "context": 200000,
            "output": 65536
          }
        }
      }
    }
  }
}
```

- `context`: Maximum input tokens the model can process
- `output`: Maximum output tokens the model can generate
## Custom Headers
You can add custom headers to API requests:
```json
{
  "provider": {
    "myprovider": {
      "options": {
        "headers": {
          "Authorization": "Bearer custom-token",
          "X-Custom-Header": "value"
        }
      }
    }
  }
}
```

## Troubleshooting
- **Check authentication**: Run `opencode auth list` to verify credentials
- **Custom provider issues**:
  - Verify the provider ID matches between `/connect` and your config
  - Confirm the correct npm package is specified
  - Check the API endpoint in `options.baseURL`