Langflow is an open-source visual framework for building multi-agent and RAG applications. Portkey adds enterprise controls:
  • 1600+ LLMs — Single interface for all providers, not just OpenAI
  • Observability — Real-time tracking for 40+ metrics and logs
  • Governance — Budget limits, rate limits, and RBAC
  • Guardrails — PII detection, content filtering, compliance controls
For enterprise governance setup, see Enterprise Governance.

Quick Start

1. Setup Portkey

1. Add Provider: Go to Model Catalog → Add Provider.
2. Configure Credentials: Select your provider (OpenAI, Anthropic, etc.), enter your API key, and create a slug like openai-prod.
3. Create Config: Go to Configs and create:
{
  "override_params": {
    "model": "@openai-prod/gpt-4o"
  }
}
Save and note the Config ID.
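Beyond the model, override_params can pin other request parameters in the same config, so every routed request inherits them. A sketch with illustrative sampling values (the slug assumes the openai-prod provider created above):

```json
{
  "override_params": {
    "model": "@openai-prod/gpt-4o",
    "temperature": 0.2,
    "max_tokens": 1024
  }
}
```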
4. Get Portkey API Key: Go to API Keys → Create new key → Attach your config → Save.

2. Configure Langflow

1. Install Langflow: Install via Docker, pip, or desktop app. See the Langflow docs.
2. Open a Flow: Create or open a flow with an OpenAI model component.
3. Configure OpenAI Component: Click the OpenAI component and open its Controls panel.
4. Add Portkey Settings:
  • Base URL: https://api.portkey.ai/v1
  • API Key: Your Portkey API key

Done! Monitor requests in the Portkey Dashboard.
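With those two settings, the component's calls become ordinary OpenAI-style chat-completions requests routed through Portkey. A standard-library sketch of the equivalent request, for reference (the API key is a placeholder, and nothing is actually sent here):

```python
import json
import urllib.request

# Placeholder; substitute the Portkey API key created in step 4 of Setup.
PORTKEY_API_KEY = "YOUR_PORTKEY_API_KEY"

def build_request(prompt: str) -> urllib.request.Request:
    """Build the chat-completions request that Langflow's OpenAI
    component sends once its base URL points at Portkey."""
    payload = {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "https://api.portkey.ai/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {PORTKEY_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("Hello")
print(req.full_url)  # https://api.portkey.ai/v1/chat/completions
```

Because the request body is plain OpenAI format, the model actually served is decided by the config attached to your Portkey API key.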

Enterprise Governance

For organizations using Langflow, Portkey adds governance controls:
Create providers with spending limits per team:
  1. Go to Model Catalog
  2. Create provider for each team with budget/rate limits
Control which models teams can access at the integration level.
Use Configs for fallbacks, load balancing, caching:
{
  "strategy": { "mode": "loadbalance" },
  "targets": [
    { "override_params": { "model": "@openai-prod/gpt-4o" } },
    { "override_params": { "model": "@anthropic-prod/claude-sonnet-4-20250514" } }
  ]
}
Create configs at app.portkey.ai/configs.
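Swapping the strategy mode turns the same targets into a failover chain: with fallback, the second target is tried only when the first fails. A sketch reusing the provider slugs above:

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "override_params": { "model": "@openai-prod/gpt-4o" } },
    { "override_params": { "model": "@anthropic-prod/claude-sonnet-4-20250514" } }
  ]
}
```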
Create team-specific API keys with metadata:
from portkey_ai import Portkey

portkey = Portkey(api_key="YOUR_ADMIN_API_KEY")

api_key = portkey.api_keys.create(
    name="frontend-team",
    workspace_id="YOUR_WORKSPACE_ID",
    defaults={
        "config_id": "your-config-id",
        "metadata": {"team": "frontend", "environment": "production"}
    },
    scopes=["logs.view", "configs.read"]
)

Features

Observability

Track 40+ metrics: cost, tokens, latency, performance. Filter by custom metadata.
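Custom metadata travels in the x-portkey-metadata header as a JSON string, so any client that can set extra headers can tag its requests for filtering. A minimal sketch (the team and environment values are examples):

```python
import json

def metadata_headers(portkey_api_key: str, metadata: dict) -> dict:
    """Request headers that tag a call with custom metadata for
    Portkey analytics; the metadata dict is JSON-encoded into the
    x-portkey-metadata header."""
    return {
        "Authorization": f"Bearer {portkey_api_key}",
        "Content-Type": "application/json",
        "x-portkey-metadata": json.dumps(metadata),
    }

headers = metadata_headers(
    "YOUR_PORTKEY_API_KEY",
    {"team": "frontend", "environment": "production"},
)
print(headers["x-portkey-metadata"])
```

In the dashboard, these tags appear as filterable fields on metrics and logs, which is what enables per-team cost attribution.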

Logs

Complete request/response tracking with metadata tags and cost attribution.

Reliability

Fallbacks, load balancing, and caching, configured through Configs as shown in the governance section above.
Guardrails

Protect workflows with real-time checks:
  • PII detection and masking
  • Content filtering
  • Custom security rules
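Guardrails attach to the same config that routes traffic. A sketch, assuming a guardrail already created in the Portkey UI (the ID is a placeholder) and the input_guardrails config field:

```json
{
  "input_guardrails": ["your-guardrail-id"],
  "override_params": { "model": "@openai-prod/gpt-4o" }
}
```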


FAQs

Can I use Langflow components other than the OpenAI one?
Yes. Any component supporting custom base URLs works with Portkey. Set the base URL to https://api.portkey.ai/v1.

Can I route to multiple providers?
Yes. Create multiple providers and attach them to a single config. The config connects to your API key.

How do I track usage per team?
Create separate providers per team, use metadata tags, or set up team-specific API keys.

Next Steps

Community: Discord · GitHub