Kontext works great with React and the Next.js App Router. You’ll wire up a provider, connect a user via OAuth, fetch per‑user context on the server, and stream responses to the client.

What you’ll build:
  • A personalized chat that uses per‑turn userQuery
  • Optional realtime updates to reflect fresh facts

1) Install

Packages:
npm install @kontext.dev/kontext-sdk ai @ai-sdk/openai
Environment variables:
  • Client: NEXT_PUBLIC_KONTEXT_API_URL, NEXT_PUBLIC_KONTEXT_API_KEY (for OAuth trigger + React hooks)
  • Server: KONTEXT_API_KEY (route handlers), DEFAULT_PRIVACY_LEVEL (strict default)
Example .env:
NEXT_PUBLIC_KONTEXT_API_URL=https://api.kontext.dev
NEXT_PUBLIC_KONTEXT_API_KEY=ktext_public_...
KONTEXT_API_KEY=ktext_server_...
DEFAULT_PRIVACY_LEVEL=strict
OPENAI_API_KEY=sk-...
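
With several required variables, failing fast at startup beats a confusing runtime error later. A minimal sketch (requireEnv is a helper written for this guide, not part of the SDK):

```typescript
// lib/env.ts (hypothetical path) — fail fast when a required env var is absent
export function requireEnv(name: string): string {
  const value = process.env[name]
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`)
  }
  return value
}

// Usage in server code, at module load time:
// const kontextApiKey = requireEnv('KONTEXT_API_KEY')
```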

2) Provider

Create app/providers.tsx and wrap your root layout. This exposes hooks (e.g., useKontext()) and holds userId after OAuth.
// app/providers.tsx
'use client'
import { ReactNode } from 'react'
import { KontextProvider } from '@kontext.dev/kontext-sdk/react'

export default function Providers({ children }: { children: ReactNode }) {
  return (
    <KontextProvider
      apiKey={process.env.NEXT_PUBLIC_KONTEXT_API_KEY!}
      apiUrl={process.env.NEXT_PUBLIC_KONTEXT_API_URL!}
    >
      {children}
    </KontextProvider>
  )
}
Wrap your root layout:
// app/layout.tsx
import Providers from './providers'

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        <Providers>{children}</Providers>
      </body>
    </html>
  )
}

3) Connect OAuth

Show a “Connect Gmail” button that starts the OAuth flow. On the callback redirect, KontextProvider reads ?user_id=... from the URL and stores it for you.
'use client'
import { useKontext } from '@kontext.dev/kontext-sdk/react'

export function ConnectGmail() {
  const { userId } = useKontext()

  async function connect() {
    const url = new URL('/oauth/gmail', process.env.NEXT_PUBLIC_KONTEXT_API_URL)
    url.searchParams.set('redirect_uri', window.location.origin)

    const res = await fetch(url.toString(), {
      headers: { 'x-api-key': process.env.NEXT_PUBLIC_KONTEXT_API_KEY! },
    })
    const { authUrl } = await res.json()
    window.location.href = authUrl
  }

  return (
    <button onClick={connect} disabled={!!userId}>
      {userId ? 'Connected' : 'Connect Gmail'}
    </button>
  )
}
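
KontextProvider handles the callback automatically, but if you need the user id outside React (analytics, logging, a non-React page), the ?user_id=... query parameter can be read directly. A minimal sketch:

```typescript
// Sketch: extract user_id from the OAuth callback URL manually.
// KontextProvider does this for you inside React.
export function userIdFromCallbackUrl(href: string): string | null {
  return new URL(href).searchParams.get('user_id')
}

// e.g. userIdFromCallbackUrl(window.location.href)
```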

4) Server Route: Get Context

Add app/api/completion/route.ts to fetch per‑user context and stream an LLM response.
// app/api/completion/route.ts
import { NextResponse } from 'next/server'
import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'
import { Persona } from '@kontext.dev/kontext-sdk'

const openai = createOpenAI({ apiKey: process.env.OPENAI_API_KEY })

export async function POST(req: Request) {
  const { prompt, userId } = await req.json()

  const persona = new Persona({
    apiKey: process.env.KONTEXT_API_KEY!,
    apiUrl: process.env.NEXT_PUBLIC_KONTEXT_API_URL,
  })

  const ctx = await persona.getContext({
    userId,
    task: 'chat',
    userQuery: prompt,
  })

  const result = await streamText({
    model: openai('gpt-4o-mini'),
    system: ctx.systemPrompt,
    prompt,
  })

  // Stream plain text so the client can decode chunks directly
  return result.toTextStreamResponse()
}
Notes:
  • userQuery personalizes retrieval per turn.
  • Privacy is fixed server‑side; clients can’t change it.
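
One design note on the route above: it constructs a new Persona client on every request. A small memoization helper (a sketch written for this guide, not an SDK API) lets you share one client per server instance:

```typescript
// Sketch: lazily create a value once and reuse it on later calls.
export function once<T>(create: () => T): () => T {
  let created = false
  let value: T | undefined
  return () => {
    if (!created) {
      value = create()
      created = true
    }
    return value as T
  }
}

// Usage sketch in the route module:
// const getPersona = once(() => new Persona({
//   apiKey: process.env.KONTEXT_API_KEY!,
//   apiUrl: process.env.NEXT_PUBLIC_KONTEXT_API_URL,
// }))
```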

5) Client Chat

Minimal chat UI that POSTs to your route and streams the response.
'use client'
import { useKontext } from '@kontext.dev/kontext-sdk/react'
import { useState } from 'react'

export function Chat() {
  const { userId } = useKontext()
  const [input, setInput] = useState('')
  const [answer, setAnswer] = useState('')

  async function onSubmit(e: React.FormEvent) {
    e.preventDefault()
    const res = await fetch('/api/completion', {
      method: 'POST',
      headers: { 'content-type': 'application/json' },
      body: JSON.stringify({ prompt: input, userId }),
    })
    const reader = res.body!.getReader()
    const decoder = new TextDecoder()
    setAnswer('')
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      setAnswer((s) => s + decoder.decode(value, { stream: true }))
    }
  }

  return (
    <form onSubmit={onSubmit}>
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button disabled={!userId}>Send</button>
      <pre>{answer}</pre>
    </form>
  )
}
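
The read loop in the component above can be extracted into a reusable helper. This sketch also passes { stream: true } to the decoder so multi‑byte characters split across chunk boundaries decode correctly:

```typescript
// Sketch: consume a text Response body chunk by chunk,
// e.g. res.body from /api/completion.
export async function readTextStream(
  body: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<void> {
  const reader = body.getReader()
  const decoder = new TextDecoder()
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    // { stream: true } buffers partial characters across chunks
    onChunk(decoder.decode(value, { stream: true }))
  }
}
```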

6) Realtime (Optional)

Add useRealtimeContext() to reflect updates as new facts arrive (requires REDIS_HOST on the API). Use usePollingContext() as a fallback.
'use client'
import { useKontext, useRealtimeContext } from '@kontext.dev/kontext-sdk/react'

export function LiveContextBadge() {
  const { userId } = useKontext()
  const { latestEvent } = useRealtimeContext({ userId })
  if (!latestEvent) return null
  return <small>Updated: {latestEvent.type} at {latestEvent.timestamp}</small>
}

7) Datasets (Optional)

Server‑side helpers for ingestion and QA:
// Upload
await persona.datasets.upload(file, { asUser: userId })

// Status
await persona.datasets.getStatus({ jobId })

// Facts‑first QA (no vector fallback)
await persona.datasets.query({ datasetId, query, userId })

// Vector search
await persona.datasets.search({ datasetId, query, userId, topK: 10 })
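
getStatus reports the state of an ingestion job, so callers typically poll it until the job finishes. A generic sketch (pollUntil and the status shape in the usage comment are assumptions for this guide, not SDK APIs):

```typescript
// Sketch: call fetchOnce repeatedly until isDone accepts the result,
// sleeping intervalMs between attempts and giving up after maxAttempts.
export async function pollUntil<T>(
  fetchOnce: () => Promise<T>,
  isDone: (value: T) => boolean,
  opts: { intervalMs?: number; maxAttempts?: number } = {},
): Promise<T> {
  const { intervalMs = 1000, maxAttempts = 30 } = opts
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const value = await fetchOnce()
    if (isDone(value)) return value
    await new Promise((resolve) => setTimeout(resolve, intervalMs))
  }
  throw new Error('pollUntil: job did not complete in time')
}

// Usage sketch (the status field is an assumption):
// const status = await pollUntil(
//   () => persona.datasets.getStatus({ jobId }),
//   (s) => s.status === 'completed',
// )
```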

Security & Privacy

  • Never expose KONTEXT_API_KEY in the browser (use server routes)
  • Server enforces DEFAULT_PRIVACY_LEVEL (strict by default)
  • Names are not redacted; medical/financial/IDs/addresses are sanitized at context build time

Troubleshooting

  • 403 on context: ensure OAuth consent exists for (developerId, userId)
  • 404 on dataset query: facts‑first returned no facts; use search or upload with asUser
  • Realtime not firing: ensure REDIS_HOST configured on the API
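
For the 403 case, one pragmatic pattern is to degrade to an unpersonalized reply instead of failing the whole request. A sketch (getContextSafely is a hypothetical wrapper written for this guide, not an SDK API):

```typescript
// Sketch: fall back to a default when context retrieval throws,
// e.g. a 403 before the user has granted OAuth consent.
export async function getContextSafely<T>(
  fetchContext: () => Promise<T>,
  fallback: T,
): Promise<T> {
  try {
    return await fetchContext()
  } catch {
    // No consent yet, or a transient failure: proceed without personalization
    return fallback
  }
}

// Usage sketch in the route:
// const ctx = await getContextSafely(
//   () => persona.getContext({ userId, task: 'chat', userQuery: prompt }),
//   { systemPrompt: 'You are a helpful assistant.' },
// )
```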