Providers

OpenCode supports 75+ LLM providers through the AI SDK and Models.dev, including both hosted services and local models.

Setup Process

  1. Add API keys using the /connect command
  2. Configure the provider in your OpenCode config (see the examples below)
  3. Credentials are stored in ~/.local/share/opencode/auth.json

Base URL Configuration

You can point a provider at a different endpoint by setting baseURL in its options. For example, for Anthropic:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://api.anthropic.com/v1"
      }
    }
  }
}
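
Overriding the base URL is useful when requests need to go through a proxy or a gateway that exposes the same API.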

OpenCode Zen

OpenCode Zen is a curated list of models, provided by the OpenCode team, that have been tested and verified to work well with OpenCode.

  1. Run /connect, select opencode
  2. Visit opencode.ai/auth to authenticate
  3. Copy and paste your API key
  4. Use /models to view recommended models
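
To make one of these models your default, you can also set it in config. A minimal sketch, assuming the top-level model key (provider/model format) and that Zen models appear under the opencode provider ID; the model ID below is a placeholder:

{
  "$schema": "https://opencode.ai/config.json",
  "model": "opencode/your-model-id"
}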

Popular Providers

Anthropic

  1. Run /connect and select Anthropic
  2. Choose Claude Pro/Max for browser authentication
  3. Access models via the /models command
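
If you use an Anthropic API key instead of Claude Pro/Max, the key can also be supplied in config. A minimal sketch, assuming the apiKey option shown for custom providers below also applies to the anthropic provider and that your key is in the ANTHROPIC_API_KEY environment variable:

{
  "provider": {
    "anthropic": {
      "options": {
        "apiKey": "{env:ANTHROPIC_API_KEY}"
      }
    }
  }
}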

OpenAI

  1. Create API key at platform.openai.com/api-keys
  2. Run /connect and search OpenAI
  3. Enter API key
  4. Select a model with /models
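
The same pattern works here: instead of pasting the key into /connect, it can be read from an environment variable. A minimal sketch, assuming the apiKey option applies to the openai provider and that the key is in OPENAI_API_KEY:

{
  "provider": {
    "openai": {
      "options": {
        "apiKey": "{env:OPENAI_API_KEY}"
      }
    }
  }
}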

OpenRouter

This example routes the moonshotai/kimi-k2 model through a specific upstream provider using OpenRouter's provider routing options (order and allow_fallbacks):

{
  "provider": {
    "openrouter": {
      "models": {
        "moonshotai/kimi-k2": {
          "options": {
            "provider": {
              "order": ["baseten"],
              "allow_fallbacks": false
            }
          }
        }
      }
    }
  }
}

Ollama (Local)

Ollama serves an OpenAI-compatible API locally, so it is configured through the @ai-sdk/openai-compatible package pointed at the local server:

{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "Llama 2"
        }
      }
    }
  }
}

LM Studio (Local)

LM Studio's local server also exposes an OpenAI-compatible API:

{
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://127.0.0.1:1234/v1"
      },
      "models": {
        "google/gemma-3n-e4b": {
          "name": "Gemma 3n-e4b (local)"
        }
      }
    }
  }
}

Amazon Bedrock

Amazon Bedrock authenticates with your AWS credentials; the region and profile options select the AWS region and the named profile from your AWS configuration:

{
  "provider": {
    "amazon-bedrock": {
      "options": {
        "region": "us-east-1",
        "profile": "my-aws-profile"
      }
    }
  }
}

Azure OpenAI

  1. Create an Azure OpenAI resource in the Azure portal
  2. Deploy a model in Azure AI Foundry
  3. Run /connect and search for Azure
  4. Set the AZURE_RESOURCE_NAME environment variable (see the sketch below for a config-based alternative)
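
A hedged sketch of a config-based setup, assuming the provider ID is azure and that the AI SDK's resourceName and apiKey options are accepted in options (the resource name below is a placeholder):

{
  "provider": {
    "azure": {
      "options": {
        "resourceName": "my-azure-resource",
        "apiKey": "{env:AZURE_OPENAI_API_KEY}"
      }
    }
  }
}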

Custom Provider Setup

Other OpenAI-compatible providers can be added manually by specifying the npm package, base URL, and models:

{
  "provider": {
    "myprovider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My AI Provider",
      "options": {
        "baseURL": "https://api.myprovider.com/v1",
        "apiKey": "{env:MY_API_KEY}"
      },
      "models": {
        "my-model": {
          "name": "My Model",
          "limit": {
            "context": 200000,
            "output": 65536
          }
        }
      }
    }
  }
}
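
The {env:MY_API_KEY} value tells OpenCode to read the key from the MY_API_KEY environment variable, so the key itself never has to be written into the config file.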

Troubleshooting

  1. Check authentication: Run opencode auth list to verify credentials
  2. Custom provider issues:
    • Verify that the provider ID matches between /connect and your config
    • Confirm the correct npm package is specified
    • Check the API endpoint in options.baseURL