# Vector Store
Vector database for AI/ML embeddings powered by pgvector. Perfect for RAG applications, semantic search, and recommendation systems.
## Features
- PostgreSQL-based with pgvector extension
- Support for multiple distance functions (L2, inner product, cosine; see the operator sketch after this list)
- HNSW and IVFFlat indexing
- Hybrid search (vector + full-text; see the sketch under Usage with Python)
- Automatic index optimization
- Compatible with popular embedding models
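For reference, pgvector exposes each of these distance functions as a SQL operator. A quick sketch against a hypothetical three-dimensional `items` table (a real column would use your embedding dimension):

```sql
-- Toy table just to demonstrate the three distance operators
CREATE TABLE items (id SERIAL PRIMARY KEY, embedding vector(3));
INSERT INTO items (embedding) VALUES ('[1,1,1]'), ('[2,2,2]'), ('[1,1,2]');

-- L2 (Euclidean) distance
SELECT id, embedding <-> '[1,1,1]' AS l2 FROM items ORDER BY l2;

-- Negative inner product (pgvector returns the negated value)
SELECT id, embedding <#> '[1,1,1]' AS neg_ip FROM items ORDER BY neg_ip;

-- Cosine distance (1 - cosine similarity)
SELECT id, embedding <=> '[1,1,1]' AS cos_dist FROM items ORDER BY cos_dist;
```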
## Plans
| Plan | Max dimensions | Max vectors | Price |
|---|---|---|---|
| Starter | 1536 | 100K | $18/mo |
| Standard | 3072 | 1M | $54/mo |
| Pro | 4096 | 10M | $135/mo |
## Quick Start
```bash
# Create a vector store
szc vector create my-vectors --plan starter --dimensions 1536

# Get the connection string
szc vector info my-vectors
```
## Usage with Python
```python
import numpy as np
import psycopg2
from pgvector.psycopg2 import register_vector

conn = psycopg2.connect("your-connection-string")
cur = conn.cursor()

# Enable the pgvector extension (a no-op if it is already installed)
cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
conn.commit()

# Register the vector type so numpy arrays can be passed as parameters
register_vector(conn)

# Create a table with a 1536-dimensional embedding column
cur.execute("""
    CREATE TABLE documents (
        id SERIAL PRIMARY KEY,
        content TEXT,
        embedding vector(1536)
    )
""")
conn.commit()

# Insert a vector; the embedding must be a 1536-dim numpy array,
# e.g. one produced by the OpenAI helper shown below
embedding = np.random.rand(1536)  # placeholder; use a real model embedding
cur.execute(
    "INSERT INTO documents (content, embedding) VALUES (%s, %s)",
    ("Hello world", embedding),
)
conn.commit()

# Search by cosine distance (<=>); use <-> for L2 or <#> for inner product
query_embedding = np.random.rand(1536)  # placeholder query embedding
cur.execute("""
    SELECT content, embedding <=> %s AS distance
    FROM documents
    ORDER BY distance
    LIMIT 5
""", (query_embedding,))
for content, distance in cur.fetchall():
    print(content, distance)
```
## Indexing
Create an HNSW index for faster queries. Note that the operator class must match the distance operator your queries use: `vector_l2_ops` for `<->`, `vector_ip_ops` for `<#>`, and `vector_cosine_ops` for `<=>` (the operator used in the search example above):
```sql
CREATE INDEX ON documents
USING hnsw (embedding vector_cosine_ops)
WITH (m = 16, ef_construction = 64);
```
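IVFFlat, the other index type listed under Features, builds faster and uses less memory than HNSW at some cost in recall. Per the pgvector docs, create it after the table has data, since the lists are computed from existing rows; `lists` around `rows / 1000` is a common starting point:

```sql
-- Build after loading data; lists are derived from existing rows
CREATE INDEX ON documents
USING ivfflat (embedding vector_cosine_ops)
WITH (lists = 100);

-- At query time, raise probes to trade speed for recall
SET ivfflat.probes = 10;
```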
## OpenAI Integration
```python
from openai import OpenAI

client = OpenAI()

def get_embedding(text):
    """Return a 1536-dimensional embedding for the given text."""
    response = client.embeddings.create(
        model="text-embedding-3-small",
        input=text,
    )
    return response.data[0].embedding
```
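Putting the pieces together, a minimal ingest-and-query loop might look like the following sketch, reusing `conn`, `cur`, and `get_embedding` from the examples above (the sample text and query are placeholders):

```python
import numpy as np

# Embed a document and store it (register_vector adapts numpy arrays)
text = "pgvector adds vector similarity search to PostgreSQL"
cur.execute(
    "INSERT INTO documents (content, embedding) VALUES (%s, %s)",
    (text, np.array(get_embedding(text))),
)
conn.commit()

# Embed the query and fetch the nearest documents by cosine distance
query = np.array(get_embedding("How do I search vectors in Postgres?"))
cur.execute("""
    SELECT content, embedding <=> %s AS distance
    FROM documents
    ORDER BY distance
    LIMIT 5
""", (query,))
for content, distance in cur.fetchall():
    print(f"{distance:.4f}  {content}")
```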