
# LLM Credentials

Sandbox0 Managed Agents separates the Sandbox0 API credential from the LLM (model provider) credential.

The official SDK authenticates to Sandbox0 with a Sandbox0 API key. The model token is stored in a vault and attached to the session. This keeps model credentials out of application request headers and lets the runtime receive them through Sandbox0's credential boundary.

## Reserved Vault Metadata

An LLM vault is identified by reserved metadata keys:

```json
{
  "sandbox0.managed_agents.role": "llm",
  "sandbox0.managed_agents.engine": "claude",
  "sandbox0.managed_agents.llm_base_url": "https://api.anthropic.com"
}
```
| Key | Required | Description |
| --- | --- | --- |
| `sandbox0.managed_agents.role` | Yes | Must be `llm` for an LLM vault |
| `sandbox0.managed_agents.engine` | Yes | Must be `claude` today |
| `sandbox0.managed_agents.llm_base_url` | No | Anthropic-compatible LLM endpoint. Defaults to `https://api.anthropic.com` when omitted. |
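If a client wants to mirror the `llm_base_url` defaulting rule locally, it can be sketched as follows (the helper name is illustrative, not part of the SDK):

```typescript
// Documented default when sandbox0.managed_agents.llm_base_url is omitted.
const DEFAULT_LLM_BASE_URL = "https://api.anthropic.com";

// Resolve the effective LLM endpoint from vault metadata.
function resolveLlmBaseUrl(metadata: Record<string, string>): string {
  return (
    metadata["sandbox0.managed_agents.llm_base_url"] ?? DEFAULT_LLM_BASE_URL
  );
}
```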

Only one `role=llm` vault can be attached to a session, and the active credential in that vault must be a `static_bearer` credential.
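The single-vault rule can be checked client-side before creating a session. A minimal sketch, with the vault shape simplified for illustration (the real SDK types may differ):

```typescript
// Simplified vault shape for illustration.
type VaultLike = { id: string; metadata: Record<string, string> };

// Return the one role=llm vault, or throw if zero or several are present,
// mirroring the server-side constraint.
function findLlmVault(vaults: VaultLike[]): VaultLike {
  const llm = vaults.filter(
    (v) => v.metadata["sandbox0.managed_agents.role"] === "llm",
  );
  if (llm.length !== 1) {
    throw new Error(`expected exactly one role=llm vault, found ${llm.length}`);
  }
  return llm[0];
}
```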

## Create an LLM Vault

```typescript
const llmVault = await client.beta.vaults.create({
  display_name: "Claude LLM",
  metadata: {
    "sandbox0.managed_agents.role": "llm",
    "sandbox0.managed_agents.engine": "claude",
    "sandbox0.managed_agents.llm_base_url": "https://api.anthropic.com",
  },
});

await client.beta.vaults.credentials.create(llmVault.id, {
  display_name: "Anthropic API key",
  auth: {
    type: "static_bearer",
    token: process.env.ANTHROPIC_API_KEY,
  },
});
```

Do not set `mcp_server_url` on the LLM credential. `mcp_server_url` is for MCP credentials, not the model provider credential.
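A defensive check for this mistake might look like the following sketch (the auth shape and helper name are assumptions for illustration, not SDK API):

```typescript
// Simplified static_bearer auth shape for illustration.
type BearerAuth = {
  type: "static_bearer";
  token: string;
  mcp_server_url?: string;
};

// Reject an LLM credential payload that mistakenly carries mcp_server_url.
function assertLlmCredentialAuth(auth: BearerAuth): void {
  if (auth.mcp_server_url !== undefined) {
    throw new Error(
      "mcp_server_url belongs on MCP credentials, not the LLM credential",
    );
  }
}
```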

## Use an Anthropic-Compatible Provider

The Claude engine expects an Anthropic-compatible API shape. To use a proxy or compatible provider, change only the LLM vault metadata and credential token.

```typescript
const llmVault = await client.beta.vaults.create({
  display_name: "Anthropic-compatible LLM",
  metadata: {
    "sandbox0.managed_agents.role": "llm",
    "sandbox0.managed_agents.engine": "claude",
    "sandbox0.managed_agents.llm_base_url": "https://anthropic-proxy.example.com",
  },
});
```

Keep the SDK `baseURL` pointed at Sandbox0 Managed Agents. The LLM base URL belongs in vault metadata.

## Runtime Injection Model

At runtime, Sandbox0 configures the wrapper with the selected LLM endpoint and injects the token through egress auth for the selected host. The generated code running inside the sandbox should not need the raw provider token.

The runtime may set placeholder environment variables for SDK compatibility, but the real credential is resolved and projected through Sandbox0-managed network policy.

## Common Errors

| Error | Fix |
| --- | --- |
| Session has no LLM output | Attach the LLM vault in `vault_ids` when creating or updating the session |
| Multiple LLM vaults attached | Attach exactly one vault with `sandbox0.managed_agents.role = llm` |
| Engine mismatch | Use `sandbox0.managed_agents.engine = claude` with Claude sessions |
| Credential has `mcp_server_url` | Remove `mcp_server_url` from the LLM credential |
| OpenAI-compatible endpoint | Use an Anthropic-compatible endpoint. OpenAI-compatible engine support is separate and not enabled by the Claude engine. |

Some TypeScript SDK versions may type `mcp_server_url` as required for `static_bearer`. For LLM vault credentials, the runtime request should still omit it. Cast the auth object only if the SDK type requires it.
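A minimal sketch of that cast, using a hypothetical stand-in for the SDK's auth type (the real type comes from your installed SDK version):

```typescript
// Hypothetical SDK auth type where mcp_server_url is (over-strictly)
// typed as required for static_bearer.
interface StaticBearerAuth {
  type: "static_bearer";
  token: string;
  mcp_server_url: string;
}

// Build the LLM credential payload without mcp_server_url and satisfy
// the SDK type with a cast; the field is simply absent at runtime.
const llmAuth = {
  type: "static_bearer",
  token: process.env.ANTHROPIC_API_KEY ?? "",
} as StaticBearerAuth;
```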

## Next Steps

- **Claude SDK**: Point the official Anthropic SDK at Sandbox0 Managed Agents
- **Compatibility**: Review the supported engine, event, environment, and lifecycle behavior