Overview

Tembo Proxy is an AI gateway that provides access to tested and verified models from Anthropic, OpenAI, and Google. You connect your local coding tools — Claude Code, Codex, OpenCode, Cursor, and others — to Tembo’s infrastructure through a single API key, with no need to manage separate provider accounts. Tembo Proxy is available on all Tembo subscription plans, including the free tier. It works the same way as configuring any other model provider — set a base URL and API key, and your tool connects through Tembo.

Background

There are dozens of AI models available, but not all of them work well for coding tasks. Getting reliable results from coding agents requires models that handle long context, follow complex instructions, and produce consistent output.
Tembo tests models across real-world coding tasks before making them available through the proxy. Models listed on this page have been verified to work with the supported coding agents.
Tembo Proxy solves the multi-provider problem by acting as a single gateway. Instead of juggling API keys and accounts across Anthropic, OpenAI, and Google, you authenticate once with your Tembo API key and get access to all supported models. Tembo handles provider authentication, routing, and billing on your behalf.

How it Works

  1. Get your API key from the Tembo dashboard under Settings → API Keys
  2. Set environment variables or config files to point your coding tool at proxy.tembo.io
  3. Start coding — requests route through Tembo to the correct provider and stream back to your tool
Proxy usage consumes credits from your Tembo subscription. All plans include a credit allocation, and paid plans support overage billing so you’re never cut off mid-task. See Pricing below for details.
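Before pointing a tool at the proxy, you can confirm your key works by calling the gateway directly. This is a minimal sketch, assuming the /anthropic endpoint is a drop-in base URL for the Anthropic Messages API (which is what setting ANTHROPIC_BASE_URL to it implies); the model ID and placeholder key are illustrative.
# Hedged sketch: assumes proxy.tembo.io/anthropic accepts standard Anthropic
# Messages API requests; replace the placeholder with your Tembo API key.
curl https://proxy.tembo.io/anthropic/v1/messages \
  -H "x-api-key: your-tembo-api-key" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-4-5-sonnet",
    "max_tokens": 32,
    "messages": [{"role": "user", "content": "Reply with OK"}]
  }'
A JSON response here means your key is active and the request was billed against your credit allocation.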

Endpoints

Models

The following models are available through Tembo Proxy. Use the endpoint and configuration that matches your coding tool.
Model | Model ID | Endpoint | Tools
Claude Opus 4.5 | claude-opus-4-5 | proxy.tembo.io/anthropic | Claude Code, OpenCode, Cursor, Amp
Claude 4.5 Sonnet | claude-4-5-sonnet | proxy.tembo.io/anthropic | Claude Code, OpenCode, Cursor, Amp
Claude 4.1 Opus | claude-4.1-opus | proxy.tembo.io/anthropic | Claude Code, OpenCode, Cursor, Amp
Claude 4.5 Haiku | claude-4-5-haiku | proxy.tembo.io/anthropic | Claude Code, OpenCode, Cursor, Amp
Claude 4 Sonnet | claude-4-sonnet | proxy.tembo.io/anthropic | Claude Code, OpenCode, Cursor
Claude 3.5 Sonnet | claude-3-5-sonnet | proxy.tembo.io/anthropic | Claude Code, OpenCode
Claude 3.5 Haiku | claude-3-5-haiku | proxy.tembo.io/anthropic | Claude Code, OpenCode
GPT-5.2 | gpt-5.2 | proxy.tembo.io/openai | Codex, OpenCode, Cursor
GPT-5.1 | gpt-5.1 | proxy.tembo.io/openai | Cursor
GPT-5.1 Codex | gpt-5.1-codex | proxy.tembo.io/proxy/openai | Codex, OpenCode, Cursor
GPT-5.1 Codex Max | gpt-5.1-codex-max | proxy.tembo.io/proxy/openai | Codex, OpenCode
GPT-5.1 Codex Mini | gpt-5.1-codex-mini | proxy.tembo.io/proxy/openai | Codex, OpenCode
GPT-5.1 Codex High | gpt-5.1-codex-high | proxy.tembo.io/proxy/openai | Codex, Cursor
GPT-5.2 Codex | gpt-5.2-codex | proxy.tembo.io/proxy/openai | Codex
GPT-5 | gpt-5 | proxy.tembo.io/openai | OpenCode
Gemini 2.5 Pro | gemini-2.5-pro | proxy.tembo.io/anthropic | Claude Code, OpenCode
Gemini 3 Pro | gemini-3-pro | proxy.tembo.io/anthropic | Cursor
Gemini 3 Flash | gemini-3-flash | proxy.tembo.io/anthropic | Cursor
Kimi K2 (Bedrock) | bedrock-kimi-k2-thinking | proxy.tembo.io/anthropic | Claude Code, OpenCode
ZAI GLM-4.7 | zai-glm-4.7 | proxy.tembo.io/openai | OpenCode
Grok | grok | Cursor API | Cursor
Composer 1 | composer-1 | Cursor API | Cursor
Configure your tool with the format agent:model, or agent:model:reasoningLevel for GPT-5 variants. For example: codex:gpt-5.2:high. For full agent configuration details, see Coding Agents.

Tool Configuration

For Claude Code, point the tool at the Anthropic-compatible endpoint and authenticate with your Tembo API key:
export ANTHROPIC_BASE_URL=https://proxy.tembo.io/anthropic
export ANTHROPIC_API_KEY=your-tembo-api-key
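To pin a specific model from the Models table rather than Claude Code's default, you can also set the model explicitly. A hedged sketch; whether the ANTHROPIC_MODEL variable and the --model flag are honored depends on your Claude Code version:
# Hedged: model override support varies by Claude Code version.
export ANTHROPIC_MODEL=claude-opus-4-5   # or pass --model claude-opus-4-5 at launch
claude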
For OpenCode with OpenAI models, configure opencode.json:
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.2",
  "provider": {
    "openai": {
      "models": {
        "gpt-5.2": {
          "id": "gpt-5.2",
          "name": "GPT-5.2"
        }
      },
      "options": {
        "baseURL": "https://proxy.tembo.io/openai",
        "apiKey": "your-tembo-api-key",
        "headers": {
          "x-reasoning-level": "high"
        }
      }
    }
  }
}
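For OpenCode with Claude models, the same pattern should carry over to the Anthropic-compatible endpoint. The sketch below simply mirrors the OpenAI block above with the anthropic provider substituted; the provider name and option keys are assumptions, so check them against your OpenCode version:
# Hedged sketch: mirrors the opencode.json structure shown above, with the
# anthropic provider and endpoint substituted. Option names are assumptions.
cat > opencode.json <<'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "anthropic/claude-opus-4-5",
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://proxy.tembo.io/anthropic",
        "apiKey": "your-tembo-api-key"
      }
    }
  }
}
EOF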
Codex may store credentials in ~/.codex/auth.json that override environment variables. Remove or update that file if your environment variables aren’t taking effect.
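A hedged sketch of that troubleshooting step, assuming Codex reads the standard OPENAI_BASE_URL and OPENAI_API_KEY variables mentioned in step 2 of How it Works; the variable names and the backup step are assumptions, not documented Tembo behavior:
# Hedged: variable support depends on your Codex version; values are placeholders.
# Use the endpoint from the Models table that matches your chosen model.
export OPENAI_BASE_URL=https://proxy.tembo.io/openai
export OPENAI_API_KEY=your-tembo-api-key   # see Pricing for the note on supplying your own OpenAI key
# If the variables are ignored, a stored credential may be taking precedence:
[ -f ~/.codex/auth.json ] && mv ~/.codex/auth.json ~/.codex/auth.json.bak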

Pricing

Proxy usage is billed through Tembo’s credit system. Every request consumes credits based on token usage — more complex tasks use more tokens and therefore more credits.
Plan | Monthly Cost | Credits Included | Overage
Free | $0 | 5 credits/day | None (paused until refresh)
Pro | $60 | 100 credits/month | Pay-as-you-go
Max | $200 | 400 credits/month | Pay-as-you-go
Anthropic and Google models require no additional API keys — Tembo provides access directly. OpenAI models require you to supply your own OpenAI API key alongside your Tembo API key.

Auto-reload

Paid plans support auto-reload to prevent interruptions. When your credit balance drops below a threshold, Tembo automatically purchases additional credits. Configure the reload threshold, target balance, and monthly limit from Billing settings.

Monthly Limits

You can set a maximum overage limit on paid plans to control spending. The auto-reload system respects this limit — even if your balance drops below the threshold, reloads won’t exceed your monthly cap. For billing questions, contact support@tembo.io. See Billing for full details.

Privacy

All Tembo infrastructure is hosted in the United States, with no infrastructure in China. Tembo is SOC 2 Type 1 certified and in SOC 2 Type 2 observation. Tembo maintains zero data retention agreements with providers where available. Current retention policies by provider:
Provider | Data Retention
Anthropic | Zero retention — code and prompts are not stored or used for training
Google | Zero retention
OpenAI | 30-day retention (OpenAI policy)
For full security details, see Security.

For Teams

Tembo Proxy works with team accounts. Administrators manage access, monitor usage, and control spending from the dashboard.

Roles

  • Admins can create and revoke API keys, invite team members, set spending limits, and view usage across the team
  • Members receive their own API key and can use the proxy within the limits set by their admin

Model Access

Per-model access controls for team administrators are coming soon. Today, all team members have access to all models available on your plan.

Bring Your Own Key

For OpenAI models, team members supply their own OpenAI API key alongside the Tembo API key. Anthropic and Google models are accessed directly through Tembo — no additional keys needed.
Bring-your-own-key support for additional providers is planned. Contact support@tembo.io for details.
See Inviting Your Team for setup instructions.

Goals

  • Simplify multi-provider access — One API key, one billing account, all major model providers
  • Verify model quality — Only expose models that have been tested and verified for coding agent use cases
  • Enable team collaboration — Shared billing, usage monitoring, and access controls for engineering teams
  • Stay provider-agnostic — Support the best models regardless of provider, so you can switch without reconfiguring infrastructure