
Get started with BedrockRouter

BedrockRouter gives you one OpenAI- and Anthropic-compatible API for every foundation model on AWS Bedrock. The proxy runs entirely inside your own AWS account — your data never leaves your infrastructure.

Architecture

BedrockRouter is a thin, OpenAI-compatible proxy that lives in your AWS account. Requests authenticate with a Bedrock API key, get translated into the Bedrock InvokeModel / Converse protocol, and stream straight back to your client.

Your App (OpenAI / Anthropic SDK) → Lambda + Function URL (BedrockRouter Proxy) → AWS Bedrock (Claude / MiniMax / etc.)

1. Authenticated request

Your client sends a standard chat completion request with a Bedrock API key in the Authorization header.

2. Protocol translation

The Lambda translates OpenAI/Anthropic-shaped requests into Bedrock's native protocol and attaches your Bedrock API key as a bearer token; no SigV4 signing is involved.
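As a rough illustration of step 2, mapping an OpenAI-style chat body onto Bedrock's Converse request shape might look like this (a simplified sketch only; the real proxy also has to handle tools, images, and streaming options):

```python
def openai_to_converse(body: dict) -> dict:
    """Sketch: map an OpenAI chat-completions body to a Bedrock Converse request.

    Converse takes system prompts as a top-level list and wraps message
    text in content blocks, rather than using a "system" role inline.
    """
    return {
        "modelId": body["model"],
        "system": [
            {"text": m["content"]}
            for m in body["messages"] if m["role"] == "system"
        ],
        "messages": [
            {"role": m["role"], "content": [{"text": m["content"]}]}
            for m in body["messages"] if m["role"] != "system"
        ],
    }
```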

3. Streaming response

Tokens stream back over SSE, transformed into the response shape your SDK expects.
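The reassembly in step 3 can be pictured with a minimal parser for OpenAI-style `data:` SSE lines (illustrative only; real clients also handle multi-line events, errors, and keep-alives):

```python
import json

def collect_deltas(sse_lines):
    """Sketch: join text deltas from OpenAI-style 'data:' SSE lines."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip comments and blank keep-alive lines
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # OpenAI's end-of-stream sentinel
        delta = json.loads(payload)["choices"][0]["delta"].get("content")
        if delta:
            parts.append(delta)
    return "".join(parts)
```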

Runs in your account

The Lambda, IAM role, and API key all live inside the AWS account you control. We never see traffic or credentials.

Native Bedrock API key

No SigV4 signing, no SDK credentials. Just a single bearer token that maps to your Bedrock IAM permissions.

Quickstart

Get a working request in under five minutes.

  1. Create an account

    Sign up at BedrockRouter to track your deployments and integrations.

    Create account →
  2. Deploy the CloudFormation stack

    Click Deploy, follow the AWS console wizard, and wait ~60 seconds for the stack to finish.

    Deploy stack →
  3. Copy your endpoint and API key

    Open the stack's Outputs tab. Copy ProxyURL and ApiKey.

  4. Make your first request

    Use any OpenAI-compatible client.

    bash
    curl https://YOUR_PROXY_URL/v1/chat/completions \
      -H "Authorization: Bearer YOUR_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{
        "model": "anthropic.claude-sonnet-4-5-20250929-v1:0",
        "messages": [{"role": "user", "content": "Hello, Bedrock!"}]
      }'
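If you prefer the terminal, the outputs from step 3 can also be read with the AWS CLI (the stack name `bedrockrouter` here is an assumption; use whatever name you gave the stack at deploy time):

```bash
aws cloudformation describe-stacks \
  --stack-name bedrockrouter \
  --query "Stacks[0].Outputs" \
  --output table
```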

SDK examples

BedrockRouter speaks both OpenAI and Anthropic protocols. Pick the SDK you already use.

OpenAI Python

python
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR_PROXY_URL/v1",
    api_key="YOUR_API_KEY",
)

response = client.chat.completions.create(
    model="anthropic.claude-sonnet-4-5-20250929-v1:0",
    messages=[{"role": "user", "content": "Explain Bedrock in one sentence."}],
)

print(response.choices[0].message.content)

Anthropic TypeScript

typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "https://YOUR_PROXY_URL",
  apiKey: "YOUR_API_KEY",
});

const message = await client.messages.create({
  model: "anthropic.claude-sonnet-4-5-20250929-v1:0",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello, Bedrock!" }],
});

console.log(message.content);

Vercel AI SDK

typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const bedrock = createOpenAI({
  baseURL: "https://YOUR_PROXY_URL/v1",
  apiKey: process.env.BEDROCK_API_KEY,
});

const { text } = await generateText({
  model: bedrock("minimax.minimax-m2-v1:0"),
  prompt: "Write a haiku about distributed systems.",
});

Authentication

BedrockRouter uses standard bearer token authentication. Pass your AWS Bedrock API key in the Authorization header on every request.

http
Authorization: Bearer brk_live_xxxxxxxxxxxxxxxxxxxx

Keys are scoped to your AWS account's Bedrock IAM permissions. Rotate or revoke them at any time from the AWS console — no code changes required on the proxy.

Pricing & billing

BedrockRouter charges zero markup. You pay AWS Bedrock's standard on-demand rates directly via your AWS bill. The Lambda proxy is metered separately under AWS Lambda — typically a few cents per million requests.

See full per-model pricing on the Models page.

FAQ

Does BedrockRouter see my prompts or responses?

No. The proxy Lambda runs inside your AWS account. We never receive traffic, prompts, completions, or API keys.

Which AWS regions are supported?

Any region where AWS Bedrock is available. The CloudFormation stack accepts a region parameter at deploy time.
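Deploying from the AWS CLI instead of the console makes the region an explicit flag. A hedged sketch (the stack name and template URL are placeholders, not real values):

```bash
aws cloudformation create-stack \
  --stack-name bedrockrouter \
  --region eu-west-1 \
  --template-url https://YOUR_TEMPLATE_URL \
  --capabilities CAPABILITY_IAM
```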

Is streaming supported?

Yes. Server-Sent Events streaming is fully supported for both OpenAI and Anthropic API formats.

Can I use this with Claude Code, Cursor, or other tools?

Yes. Any tool that accepts an OpenAI- or Anthropic-compatible base URL and API key works out of the box.
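For example, many Anthropic-SDK-based tools read a base URL and key from environment variables; the variable names below are typical but tool-specific, so check your tool's documentation:

```bash
export ANTHROPIC_BASE_URL="https://YOUR_PROXY_URL"
export ANTHROPIC_API_KEY="YOUR_API_KEY"
```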

How do I uninstall?

Delete the CloudFormation stack from the AWS console. All resources — Lambda, IAM role, API key — are removed cleanly.
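Assuming you named the stack `bedrockrouter`, the CLI equivalent is a single command:

```bash
aws cloudformation delete-stack --stack-name bedrockrouter
```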

Ready to deploy?

Stand up the full proxy in your AWS account in under two minutes.

Deploy stack