Overview
Streams an AI-generated answer to a natural language question, grounded in your project’s content. Uses SSE (Server-Sent Events) to deliver tokens incrementally, and returns source citations once the stream completes. For integration guidance, see Ask Content.

Usage Example
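As a minimal sketch of the SSE mechanics described above: each event’s `data:` line is assumed to carry one answer token, with a `[DONE]` sentinel ending the stream (the exact wire format your project emits may differ — adapt accordingly).

```typescript
type StreamState = {
  answer: string;    // grows token by token
  streaming: boolean; // true while the stream is open
};

function consumeSseChunk(state: StreamState, chunk: string): StreamState {
  let { answer, streaming } = state;
  for (const line of chunk.split("\n")) {
    if (!line.startsWith("data:")) continue; // skip comments and blank lines
    const raw = line.slice("data:".length);
    // Per the SSE spec, a single space after the colon is a separator, not data
    const data = raw.startsWith(" ") ? raw.slice(1) : raw;
    if (data === "[DONE]") {
      streaming = false; // stream complete; citations become available
    } else {
      answer += data;    // append the next token
    }
  }
  return { answer, streaming };
}

// Simulated stream of SSE events
let state: StreamState = { answer: "", streaming: true };
for (const chunk of ["data: Hello", "data:  world", "data: [DONE]"]) {
  state = consumeSseChunk(state, chunk);
}
console.log(state.answer, state.streaming);
```

The hook performs this accumulation for you; the sketch only shows why the answer value grows incrementally rather than arriving whole.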
Parameters
Call ask() with:
- The natural language question.
- Limit the content types searched for context.
- Scope context lookup to a specific space.
- Scope context lookup to a specific conversation.
- Maximum number of source records to use as context.
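The parameter names below are illustrative only (the source does not give them); the sketch shows the shape of a call that combines the question with the optional scoping filters listed above, dropping unset fields so the request stays minimal.

```typescript
// Hypothetical parameter names — check your SDK's type definitions.
interface AskParams {
  question: string;          // the natural language question
  contentTypes?: string[];   // limit content types searched for context
  spaceId?: string;          // scope context lookup to a space
  conversationId?: string;   // scope context lookup to a conversation
  maxResults?: number;       // max source records used as context
}

function buildAskPayload(params: AskParams): Record<string, unknown> {
  // Drop undefined optional fields before sending the request
  return Object.fromEntries(
    Object.entries(params).filter(([, v]) => v !== undefined)
  );
}

const payload = buildAskPayload({
  question: "How do I rotate API keys?",
  maxResults: 5,
});
console.log(payload);
```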
Returns
- The AI-generated answer. Grows token by token while streaming is true.
- Source records used as context. Populated after streaming ends.
- true while the SSE stream is open and tokens are arriving.
- true from the ask() call until the first token arrives.
- Error message if the request failed.
- Starts a new question. Aborts any in-flight stream first.
- Aborts the in-flight stream and clears all state.
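The state lifecycle above can be modeled as a small reducer. Field and event names here are assumptions for illustration, not the hook’s real API: asking sets loading, the first token flips loading to streaming, and stream completion populates the source citations.

```typescript
type AskState = {
  answer: string;
  sources: string[];     // populated after streaming ends
  isStreaming: boolean;  // stream open, tokens arriving
  isLoading: boolean;    // ask() called, first token not yet received
  error: string | null;
};

type AskEvent =
  | { type: "ask" }                         // new question; clears prior state
  | { type: "token"; token: string }        // one streamed token
  | { type: "done"; sources: string[] }     // stream closed, citations ready
  | { type: "error"; message: string }
  | { type: "reset" };                      // abort and clear all state

const initial: AskState = {
  answer: "", sources: [], isStreaming: false, isLoading: false, error: null,
};

function reduce(state: AskState, event: AskEvent): AskState {
  switch (event.type) {
    case "ask":   // any in-flight stream is implicitly abandoned
      return { ...initial, isLoading: true };
    case "token": // first token ends loading and marks streaming
      return { ...state, isLoading: false, isStreaming: true,
               answer: state.answer + event.token };
    case "done":
      return { ...state, isStreaming: false, sources: event.sources };
    case "error":
      return { ...initial, error: event.message };
    case "reset":
      return initial;
  }
}

let s = initial;
s = reduce(s, { type: "ask" });
s = reduce(s, { type: "token", token: "Hi" });
s = reduce(s, { type: "done", sources: ["doc-1"] });
console.log(s.answer, s.sources, s.isStreaming);
```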
In React Native, SSE streaming requires polyfills. Install
react-native-fetch-api, web-streams-polyfill, and react-native-polyfill-globals, then call polyfillGlobals() at app startup before using this hook.
