OpenAI Agent Builder is a visual canvas for creating multi-step agent workflows. Export production-ready Python code and add Portkey for:
  • Complete observability of agent workflows
  • Cost tracking and optimization
  • Reliability features (fallbacks, retries)
  • Access to 1600+ LLMs
  • Guardrails for safe agent behavior

Quick Start

1. Design in Agent Builder

Open OpenAI Agent Builder and create your workflow using the visual canvas.
2. Export Code

Click Code → Agents SDK to get the Python implementation.
3. Install Packages

pip install -U openai-agents portkey-ai
4. Add Portkey

Replace the OpenAI client with Portkey:
from agents import Agent, Runner, set_default_openai_client, set_default_openai_api
from openai import AsyncOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

client = AsyncOpenAI(
    base_url=PORTKEY_GATEWAY_URL,
    api_key="YOUR_PORTKEY_API_KEY",
    default_headers=createHeaders(provider="@openai-prod")
)
set_default_openai_client(client, use_for_tracing=False)
set_default_openai_api("chat_completions")

# Your Agent Builder workflow code continues as exported...
Update each agent's model field to Portkey's provider-prefixed format: @openai-prod/gpt-4o
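For example, a minimal end-to-end sketch; the agent name, instructions, and prompt below are placeholders rather than exported workflow code, and the only Portkey-specific change is the provider-prefixed model string.

import asyncio
from agents import Agent, Runner

# Placeholder agent standing in for your exported workflow; only the
# model field changes to Portkey's @<provider-slug>/<model> format.
research_agent = Agent(
    name="Research Agent",
    instructions="You research topics and summarize findings.",
    model="@openai-prod/gpt-4o"
)

async def main():
    result = await Runner.run(research_agent, "Summarize recent trends in agent frameworks.")
    print(result.final_output)

asyncio.run(main())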

Setup

1. Add Provider in Model Catalog

Go to Model Catalog → Add Provider. Select your provider (OpenAI, Anthropic, etc.), enter your API keys, and name the provider (e.g., openai-prod). Your provider slug is then @openai-prod.
2. Get Portkey API Key

Create an API key at app.portkey.ai/api-keys. Pro tip: attach a config to the key for fallbacks, caching, and guardrails; it applies automatically to every request made with that key.

Production Features

Observability

All workflow executions are logged in Portkey. Add trace IDs and metadata to filter and group them:
default_headers=createHeaders(
    provider="@openai-prod",
    trace_id="workflow-session-123",
    metadata={
        "workflow_type": "research",
        "_user": "user_123",
        "environment": "production"
    }
)
Analytics with metadata filters
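Since the trace ID typically changes per workflow run, one option is to build the headers at run time. A small sketch, assuming a helper of your own (make_client is illustrative, not part of the Portkey SDK):

from openai import AsyncOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

def make_client(session_id: str, user_id: str) -> AsyncOpenAI:
    # One client per workflow session, so each run gets its own trace_id
    # and metadata in the Portkey dashboard.
    return AsyncOpenAI(
        base_url=PORTKEY_GATEWAY_URL,
        api_key="YOUR_PORTKEY_API_KEY",
        default_headers=createHeaders(
            provider="@openai-prod",
            trace_id=f"workflow-{session_id}",
            metadata={"workflow_type": "research", "_user": user_id, "environment": "production"}
        )
    )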

Reliability

Enable fallbacks via Configs:
client = AsyncOpenAI(
    base_url=PORTKEY_GATEWAY_URL,
    api_key="YOUR_PORTKEY_API_KEY",
    default_headers=createHeaders(
        config={
            "strategy": { "mode": "fallback" },
            "targets": [
                { "override_params": { "model": "@openai-prod/gpt-4o" } },
                { "override_params": { "model": "@anthropic-prod/claude-sonnet-4" } }
            ]
        }
    )
)
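Alternatively, save the same fallback config in Portkey's Configs UI and reference it by ID instead of inlining the JSON; the ID below is a placeholder.

client = AsyncOpenAI(
    base_url=PORTKEY_GATEWAY_URL,
    api_key="YOUR_PORTKEY_API_KEY",
    # Placeholder config ID; copy the real one from the Configs page
    default_headers=createHeaders(config="pc-fallback-xxx")
)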

Guardrails

Add input/output validation:
default_headers=createHeaders(
    provider="@openai-prod",
    config={
        "input_guardrails": ["guardrail-id-xxx"],
        "output_guardrails": ["guardrail-id-yyy"]
    }
)
Guardrails can:
  • Detect and redact PII
  • Filter harmful content
  • Validate response formats
  • Apply custom business rules

Guardrails Guide

PII detection, content filtering, and custom rules

Caching

Reduce costs with response caching:
default_headers=createHeaders(
    provider="@openai-prod",
    config={ "cache": { "mode": "semantic" } }
)
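Semantic caching matches requests by meaning. For exact-match caching with an expiry, a simple-mode config with a max_age (in seconds) is a common alternative; adjust the value to your freshness needs.

default_headers=createHeaders(
    provider="@openai-prod",
    # Exact-match cache; entries expire after max_age seconds
    config={ "cache": { "mode": "simple", "max_age": 3600 } }
)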

Prompt Templates

Use Portkey’s prompt management for versioned prompts:
from portkey_ai import Portkey

portkey = Portkey(api_key="YOUR_PORTKEY_API_KEY")

prompt_data = portkey.prompts.render(
    prompt_id="YOUR_PROMPT_ID",
    variables={"task": "research"}
)

agent = Agent(
    name="Assistant",
    instructions=prompt_data.data.messages[0]["content"],
    model="@openai-prod/gpt-4o"
)

Prompt Engineering Studio

Prompt versioning and collaboration

Switching Providers

Use any of 1600+ models:
# OpenAI
createHeaders(provider="@openai-prod")

# Anthropic
createHeaders(provider="@anthropic-prod")

# Google
createHeaders(provider="@google-prod")

Supported Providers

See all 1600+ supported models

Enterprise Governance

Set up centralized control for your workflows.
1. Add Provider with Budget

Go to Model Catalog → Add Provider. Set budget limits and rate limits.
2. Create Config

Go to Configs:
{
  "override_params": { "model": "@openai-prod/gpt-4o" }
}
3. Create Team API Keys

Go to API Keys. Create a key per team and attach the appropriate config.
4. Distribute to Teams

Teams use their Portkey API key:
client = AsyncOpenAI(
    base_url=PORTKEY_GATEWAY_URL,
    api_key="TEAM_PORTKEY_API_KEY"  # Config attached to key
)
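The team's exported Agent Builder code then needs only the standard wiring from the Quick Start; routing, fallbacks, and budget limits come from the provider and config attached to the key.

from agents import set_default_openai_client, set_default_openai_api

# Same wiring as the Quick Start; no per-team code changes beyond the API key
set_default_openai_client(client, use_for_tracing=False)
set_default_openai_api("chat_completions")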

Enterprise Features

Governance, security, and compliance

FAQ

Does my exported Agent Builder code work with Portkey?
Yes. Export your workflow, add the Portkey client initialization, and your code works unchanged.

Are Agents SDK features like handoffs, tools, and guardrails supported?
Yes. Handoffs, tools, and guardrails all work, with Portkey observability and reliability on top.

How do I separate different workflows in the dashboard?
Add metadata to your requests, then filter by workflow type, user, or environment in the dashboard.

Can Portkey manage my provider API keys?
Yes. Portkey stores your provider keys securely, and you can rotate them without code changes.

Resources