Build a lightweight app that sends video segments to Interhuman over WebSocket and renders social signals in real time—no need to wait for a full upload.

Watch the codealong video

Get the stream connected end-to-end (camera → WebSocket → logs) before polishing the UI. Keep your terminal visible to spot connection or auth errors.

Steps at a glance

  1. Project setup — Minimal frontend (e.g. React or Next.js) with upload + results layout.
  2. Authentication — Exchange your key for a token (Authentication).
  3. Connect — WebSocket to wss://api.interhuman.ai/v0/stream/analyze with your token (Streaming API).
  4. Capture & send — getUserMedia + MediaRecorder for WebM segments; send binary over the socket.
  5. Handle messages — Listen for processing, result, completed, and error.
  6. Display signals — Update your UI from the signals array in each result.
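Steps 3–6 can be sketched in plain browser JavaScript. The endpoint URL and event names (`processing`, `result`, `completed`, `error`) come from the steps above; the query-parameter auth scheme, the exact message field names (`signals`, `message`), and the one-second segment interval are assumptions for illustration, so check the Authentication and Streaming API references for the real shapes:

```javascript
// Reduce one incoming server message to a UI update.
// Field names (signals, message) are assumptions; see the Streaming API docs.
function describeMessage(msg) {
  switch (msg.type) {
    case "processing": return { status: "processing" };
    case "result":     return { status: "result", signals: msg.signals ?? [] };
    case "completed":  return { status: "completed" };
    case "error":      return { status: "error", detail: msg.message };
    default:           return { status: "unknown" };
  }
}

// Wire camera -> MediaRecorder -> WebSocket. `onUpdate` receives the
// objects produced by describeMessage so the UI layer stays decoupled.
async function startStreaming(token, onUpdate) {
  // Passing the token as a query parameter is an assumption;
  // use whatever the Authentication guide specifies.
  const ws = new WebSocket(
    `wss://api.interhuman.ai/v0/stream/analyze?token=${token}`
  );
  ws.onmessage = (ev) => onUpdate(describeMessage(JSON.parse(ev.data)));

  const stream = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });

  // Each dataavailable event carries one WebM segment; forward it as binary.
  recorder.ondataavailable = (ev) => {
    if (ev.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(ev.data);
    }
  };

  // Start recording only once the socket is open; emit ~1s segments.
  ws.onopen = () => recorder.start(1000);
  return { ws, recorder };
}
```

Keeping `describeMessage` pure makes it easy to unit-test your message handling without a camera or a live socket, and `recorder.start(1000)` is what turns the capture into a stream of small segments instead of one big upload.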

Next steps