Supabase provides hosted Postgres with the pgvector extension for AI applications. With Portkey, you can generate embeddings from 1600+ models and store them in Supabase for efficient similarity search.
## Prerequisites

- A Supabase project (project URL and API key)
- A Portkey API key with an embedding provider configured
## Setup
### 1. Install Dependencies

```sh
pip install portkey-ai supabase
```
### 2. Prepare Database

#### Create a Supabase Project

Go to Supabase and create a new project.

#### Enable pgvector

In **Database → Extensions**, enable pgvector, or run:

```sql
create extension if not exists vector;
```

#### Create a Documents Table

Create a table to hold documents and their embeddings. OpenAI's `text-embedding-ada-002` outputs 1536 dimensions:

```sql
create table documents (
  id bigserial primary key,
  content text,
  embedding vector(1536)
);
```
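Once embeddings are stored in a `vector` column, pgvector ranks rows by distance; its `<=>` operator computes cosine distance. As an illustrative sketch (plain Python, not pgvector itself), this is the quantity that operator returns:

```python
import math

def cosine_distance(a, b):
    """Return 1 - cosine similarity, the value pgvector's `<=>` operator computes."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1 - dot / (norm_a * norm_b)

print(cosine_distance([1.0, 0.0], [1.0, 0.0]))  # identical vectors -> 0.0
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))  # orthogonal vectors -> 1.0
```

Smaller distances mean more similar documents, which is why a query like `order by embedding <=> :query_embedding limit 5` returns the closest matches first.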
### 3. Initialize Clients

```python
from portkey_ai import Portkey
from supabase import create_client, Client

# Supabase
supabase: Client = create_client(
    "YOUR_SUPABASE_PROJECT_URL",
    "YOUR_SUPABASE_API_KEY"
)

# Portkey
client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    provider="@openai-prod"
)
```
## Generate & Store Embeddings

```python
# Generate embedding
response = client.embeddings.create(
    model="text-embedding-ada-002",
    input="The food was delicious and the waiter..."
)
embedding = response.data[0].embedding

# Store in Supabase
result = supabase.table('documents').insert({
    "content": "The food was delicious and the waiter...",
    "embedding": embedding
}).execute()
```
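For more than a handful of documents, it is cheaper to embed and insert in batches. A minimal sketch, assuming Portkey follows the OpenAI-compatible embeddings signature where `input` may also be a list of strings; the `build_rows` helper below is hypothetical, not part of either SDK:

```python
def build_rows(texts, embeddings):
    """Pair each source text with its embedding vector as Supabase insert payloads."""
    if len(texts) != len(embeddings):
        raise ValueError("texts and embeddings must have the same length")
    return [{"content": t, "embedding": e} for t, e in zip(texts, embeddings)]

# Hypothetical usage against the clients configured above:
# texts = ["First review...", "Second review..."]
# response = client.embeddings.create(model="text-embedding-ada-002", input=texts)
# embeddings = [item.embedding for item in response.data]
# supabase.table("documents").insert(build_rows(texts, embeddings)).execute()
```

Passing a list to `insert` performs a single bulk insert rather than one round trip per document.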
## Switch Providers

Change the provider to use a different embedding model:

```python
# Cohere embeddings (1024 dimensions)
client = Portkey(
    api_key="YOUR_PORTKEY_API_KEY",
    provider="@cohere-prod"
)

response = client.embeddings.create(
    model="embed-english-v3.0",
    input_type="search_query",
    input="The food was delicious and the waiter..."
)
```
Cohere's `embed-english-v3.0` outputs 1024 dimensions, so create a separate table with `vector(1024)` for Cohere embeddings.
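Because each table's `vector(n)` column is fixed-size, inserting a vector of the wrong length fails at the database. A small illustrative guard, assuming one table per model; `TABLE_DIMENSIONS`, `check_dimension`, and the `documents_cohere` table name are hypothetical, not part of any SDK:

```python
# Hypothetical mapping from table name to its declared vector(n) dimension.
TABLE_DIMENSIONS = {
    "documents": 1536,         # text-embedding-ada-002
    "documents_cohere": 1024,  # embed-english-v3.0
}

def check_dimension(table, embedding):
    """Raise before insert if the embedding length does not match the table's column."""
    expected = TABLE_DIMENSIONS[table]
    if len(embedding) != expected:
        raise ValueError(
            f"{table} expects vector({expected}), got a vector of length {len(embedding)}"
        )
    return embedding
```

Calling `check_dimension("documents", embedding)` before `insert` turns a database error into a clear client-side one when you switch models but forget to switch tables.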
## Next Steps