Overview

Streams an AI-generated answer to a natural language question, grounded in your project’s content. Uses SSE (Server-Sent Events) to deliver tokens incrementally. Returns source citations once the stream completes. For integration guidance, see Ask Content.
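Under the hood, the answer arrives as an SSE stream: each `data:` event carries a token that is appended to the growing answer. As a rough illustration of that framing (a hypothetical parser for this page only, not the library's actual implementation):

```typescript
// Hypothetical sketch of SSE frame parsing, for illustration only.
// Each event is a "data: ..." line; events are separated by a blank line.
function parseSSEChunk(chunk: string): string[] {
  const tokens: string[] = [];
  for (const frame of chunk.split("\n\n")) {
    for (const line of frame.split("\n")) {
      if (line.startsWith("data: ")) {
        tokens.push(line.slice("data: ".length));
      }
    }
  }
  return tokens;
}

// The hook appends each token to the answer as it arrives:
// parseSSEChunk("data: Hel\n\ndata: lo\n\n") → ["Hel", "lo"]
```

The blank-line delimiter is what lets tokens render incrementally: each completed frame can be appended to state without waiting for the full response.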

Usage Example

import { useAskContent } from "@replyke/react-js";

function AskUI() {
  const { answer, sources, streaming, loading, error, ask, reset } = useAskContent();

  return (
    <div>
      <button
        onClick={() => ask({ query: "How do I reset my password?" })}
        disabled={loading || streaming}
      >
        Ask
      </button>
      <button onClick={reset} disabled={loading || streaming}>
        Clear
      </button>
      {error && <p role="alert">{error}</p>}
      {answer && (
        <p>
          {answer}
          {streaming && " ▋"}
        </p>
      )}
      {sources.map((s) => (
        <div key={s.record.id}>
          [{s.sourceType}] {s.record.id}
        </div>
      ))}
    </div>
  );
}

Parameters

Call ask with:

- `query` (`string`, required): The natural language question.
- `sourceTypes` (`("entity" | "comment" | "message")[]`): Limit the content types searched for context.
- `spaceId` (`string`): Scope context lookup to a specific space.
- `conversationId` (`string`): Scope context lookup to a specific conversation.
- `limit` (`number`): Maximum number of source records to use as context.
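Taken together, the parameters above can be written as a TypeScript type. This shape is assembled from the fields documented on this page; the library may export an equivalent type of its own, and the ids below are hypothetical:

```typescript
// Parameter shape for ask(), assembled from the documented fields.
type AskParams = {
  query: string; // required
  sourceTypes?: ("entity" | "comment" | "message")[];
  spaceId?: string;
  conversationId?: string;
  limit?: number;
};

// Example: restrict context to comments within one space (hypothetical ids).
const scopedAsk: AskParams = {
  query: "What did users say about the new editor?",
  sourceTypes: ["comment"],
  spaceId: "space_123",
  limit: 5,
};
```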

Returns

- `answer` (`string`): The AI-generated answer. Grows token by token while `streaming` is `true`.
- `sources` (`ContentSearchResult[]`): Source records used as context. Populated after streaming ends.
- `streaming` (`boolean`): `true` while the SSE stream is open and tokens are arriving.
- `loading` (`boolean`): `true` from the `ask()` call until the first token arrives.
- `error` (`string | null`): Error message if the request failed.
- `ask` (`(props) => void`): Starts a new question. Aborts any in-flight stream first.
- `reset` (`() => void`): Aborts any in-flight stream and clears all state.
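The flags above describe a simple lifecycle: waiting for the first token, then streaming, then done. One way to fold them into a single UI status (a hypothetical helper for illustration, not part of the hook):

```typescript
// Hypothetical helper: derive one UI status from the hook's flags,
// following the lifecycle described above.
type Status = "idle" | "waiting" | "streaming" | "done" | "error";

function deriveStatus(state: {
  loading: boolean;
  streaming: boolean;
  error: string | null;
  answer: string;
}): Status {
  if (state.error) return "error";
  if (state.loading) return "waiting"; // ask() called, no token yet
  if (state.streaming) return "streaming"; // tokens arriving
  if (state.answer) return "done";
  return "idle";
}
```

A component can switch on the derived status instead of checking each flag inline.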
In React Native, SSE streaming requires polyfills. Install react-native-fetch-api, web-streams-polyfill, and react-native-polyfill-globals, then call polyfillGlobals() at app startup before using this hook.
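A minimal entry-point sketch of that setup. The `polyfillGlobals` name and import path follow this page's description; verify them against the react-native-polyfill-globals README for your installed version:

```typescript
// App entry (e.g. index.js) — run before any component renders.
// NOTE: the import path below is an assumption based on this page;
// check the react-native-polyfill-globals README for your version.
import { polyfillGlobals } from "react-native-polyfill-globals";

// Installs fetch/stream globals (backed by react-native-fetch-api and
// web-streams-polyfill) that SSE streaming depends on.
polyfillGlobals();
```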