Why this demo exists

Most edge-runtime PDF attempts fail on Node-specific dependencies (fs, pdf-parse, or pdfjs-dist assumptions that don't hold outside Node).
This demo uses OkraPDF as a network-native PDF layer, so your edge function only ever calls fetch().

What it proves

  1. Upload PDF by URL and store doc_id in your own app database.
  2. Send user questions + doc_id to Okra’s OpenAI-compatible endpoint.
  3. Stream responses back in real-time from an edge function.
No local PDF parsing, no vector schema migration, no runtime-specific workarounds.

Security and performance angle

  • Your edge code never handles raw PDF parsing internals.
  • The app stores stable document references (doc_id) instead of large blobs.
  • Streaming chat stays fast because retrieval happens close to the document system.

Demo Source

runtime-demo/demos/supabase-chatpdf

Build path

  1. Deploy upload endpoint: accept a PDF URL, call OkraPDF once, and save the returned doc_id.
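A minimal sketch of the upload step, assuming a hypothetical OkraPDF ingestion URL and a response containing `doc_id` (check OkraPDF's docs for the real endpoint and shape). In Supabase you would wire `handleUpload` up with `Deno.serve` and read the key from `Deno.env.get("OKRA_API_KEY")`:

```typescript
// Assumed OkraPDF ingestion endpoint; the real URL and response
// shape come from OkraPDF's documentation.
const OKRA_UPLOAD_URL = "https://api.okrapdf.example/v1/documents";

// Validate and shape the one-time ingestion request body.
export function buildUploadPayload(pdfUrl: string): { url: string } {
  if (!/^https?:\/\//.test(pdfUrl)) {
    throw new Error("pdfUrl must be an absolute http(s) URL");
  }
  return { url: pdfUrl };
}

// Edge handler: register the PDF with OkraPDF once and return the
// doc_id the caller persists in its own database.
export async function handleUpload(
  req: Request,
  apiKey: string,
): Promise<Response> {
  const { pdfUrl } = await req.json();
  const okraRes = await fetch(OKRA_UPLOAD_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildUploadPayload(pdfUrl)),
  });
  const { doc_id } = await okraRes.json();
  // Store doc_id (not the PDF bytes) in your app table here.
  return new Response(JSON.stringify({ doc_id }), {
    headers: { "Content-Type": "application/json" },
  });
}
```

Because only the small `doc_id` string is persisted, the app database never holds raw PDF blobs.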
  2. Deploy chat endpoint: forward chat turns plus doc_id to Okra’s model endpoint and stream the result.
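A sketch of the chat step, assuming an OpenAI-compatible completions URL and a top-level `doc_id` field in the request body (both are illustrative; the actual convention comes from Okra's docs). The handler pipes the upstream SSE body straight through, which is what keeps streaming fast at the edge:

```typescript
// Assumed OpenAI-compatible chat endpoint for OkraPDF; the real URL,
// model name, and doc_id convention come from Okra's documentation.
const OKRA_CHAT_URL = "https://api.okrapdf.example/v1/chat/completions";

interface ChatTurn {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build an OpenAI-style streaming request body, attaching the stored
// doc_id so retrieval happens on Okra's side, next to the document.
export function buildChatPayload(docId: string, messages: ChatTurn[]) {
  return { model: "okra-chat", stream: true, doc_id: docId, messages };
}

// Edge handler: forward the upstream stream untouched to the client.
export async function handleChat(
  req: Request,
  apiKey: string,
): Promise<Response> {
  const { docId, messages } = await req.json();
  const upstream = await fetch(OKRA_CHAT_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatPayload(docId, messages)),
  });
  // Pass the body through as-is; the browser reads it as an SSE stream.
  return new Response(upstream.body, {
    headers: { "Content-Type": "text/event-stream" },
  });
}
```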
  3. Ship UI: keep the frontend simple with a URL input, document list, and chat panel.
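For the chat panel, a minimal client-side sketch: POST the question to your chat edge function (the endpoint path and request shape here are assumptions mirroring the handler above) and surface the streamed reply chunk by chunk:

```typescript
// Incrementally decode a streamed response body, invoking onChunk
// for each piece of text as it arrives.
export async function readStream(
  body: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<void> {
  const reader = body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}

// Send one user question plus the stored docId to the chat endpoint
// and stream the answer into the UI via onChunk.
export async function askDocument(
  endpoint: string,
  docId: string,
  question: string,
  onChunk: (text: string) => void,
): Promise<void> {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      docId,
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!res.body) throw new Error("no response body to stream");
  await readStream(res.body, onChunk);
}
```

Appending each chunk to the chat panel as it arrives is what makes the answer feel real-time, rather than waiting for the full completion.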