Accessing Stream Chunks (Streamed generation) #36
Comments
Sorry, I was planning to fix this earlier but was in the middle of our big migration to a monorepo. Looking into this now.
Fixed in the latest release.
I would like to reopen this issue. Regarding the second question: I don't understand how to do it with .chat. I can see that it is supposed to return a ReadableStream, and I have set stream to true, but I cannot get it to work. Any example or ideas, @dosco?
@taieb-tk have you looked at the streaming1.ts and streaming2.ts examples? |
@dosco

```typescript
const ai = new ax.AxAIOpenAI({
```

Not sure what to do with the response in the next step... Could you possibly help me? :)
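A generic sketch of what "the next step" could look like, assuming (as the thread suggests) that the streamed call resolves to a web `ReadableStream`. The `collectChunks` and `demoStream` names are hypothetical helpers for illustration, not part of the ax API; the demo stream stands in for a real response so the consumption pattern is runnable without network access:

```typescript
// Consume a web ReadableStream chunk by chunk using its reader.
// Assumption: the streamed result behaves like a standard ReadableStream
// (as hinted at in this thread); verify against the ax streaming examples.
async function collectChunks(stream: ReadableStream<string>): Promise<string[]> {
  const chunks: string[] = [];
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}

// Stand-in stream so the pattern runs without a live model call.
function demoStream(): ReadableStream<string> {
  const parts = ["Hello", ", ", "world"];
  return new ReadableStream<string>({
    start(controller) {
      for (const p of parts) controller.enqueue(p);
      controller.close();
    },
  });
}

async function main() {
  const chunks = await collectChunks(demoStream());
  console.log(chunks.join("")); // prints "Hello, world"
}
main();
```

The same reader loop works for any web stream in Node 18+, where `ReadableStream` is a global.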
I'm submitting a ...
[x] question about how to use this project

Summary

I'm encountering two problems when working with the streaming example `examples/streaming2.ts`:

1. With `stream: true`, I get an error: `Missing required fields: answerInPoints`. What's causing this error, and how can I resolve it?
2. With `stream: true`, how can I access the result chunks? Are there methods similar to `for await (const chunk of result)` or `completion.data.on()` that I can use to process the incoming stream? (Similar to "How to use stream: true?" openai/openai-node#18)

Any guidance on resolving these issues would be greatly appreciated. Thank you!