
functionCallResult never gets called in the frontend with ChatCompletionStream #1060

Open
1 task done
cosbgn opened this issue Sep 11, 2024 · 7 comments
Labels
bug Something isn't working

Comments

cosbgn commented Sep 11, 2024

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

In my backend I do:

const stream = client.beta.chat.completions.runTools({})
return stream.toReadableStream()

And in my frontend I do:

const runner = ChatCompletionStream.fromReadableStream( response.body )
runner.on('message', (message) => {
    messages.value.push(message)
})

This works, but after the role: assistant message with tool_calls, the tool call results never get pushed to the messages array. On the frontend, runner.on('functionCallResult') also never fires.

This means that subsequent calls fail, because after a tool call we must pass the tool call results back.

How can I get the tool call function response to show up in messages?

To Reproduce

  1. return a stream to the frontend:

const stream = client.beta.chat.completions.runTools({})
return stream.toReadableStream()

  2. read the stream with ChatCompletionStream

  3. fail to get the function response as a message

P.S. I've also tested with ChatCompletionStreamingRunner and it's the same.

Code snippets

No response

OS

mac latest

Node version

22

Library version

latest

@cosbgn cosbgn added the bug Something isn't working label Sep 11, 2024

cosbgn commented Sep 11, 2024

After some investigation: when role === "tool", the message never gets added to the stream.

In other words, the following will log only on the backend and never on the frontend.

	runner.on("message", (message) => {
		if (message.role === "tool"){
			console.log(message) // Works on backend but not on frontend
		}
	})

Is there a way to manually push it as a chunk? Is there anything I can do to at least send it as extra data and then somehow add it back at the correct index in my messages on the frontend? I really need this data, and OpenAI will crash on my next message anyway, since it doesn't have the tool call response.
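One possible side channel for the "extra data" idea above, sketched under the assumption that you control both the backend and the frontend: serialize each role "tool" message as its own tagged NDJSON line, then unwrap it on the frontend and splice it back in right after the assistant message carrying the matching tool_call. The __tool_message field and all helper names here are invented for this sketch, not part of the SDK:

```javascript
// Backend: wrap a role:"tool" message in a tagged NDJSON line that the
// normal ChatCompletionStream chunks will never produce.
function encodeToolMessage(message) {
  return JSON.stringify({ __tool_message: message }) + "\n";
}

// Frontend: detect and unwrap a tagged line; returns null for ordinary chunks.
function decodeToolMessage(line) {
  try {
    const parsed = JSON.parse(line);
    return parsed && parsed.__tool_message ? parsed.__tool_message : null;
  } catch {
    return null;
  }
}

// Insert a tool message right after the assistant message whose tool_calls
// array contains the matching tool_call_id, keeping the history valid for
// the next API call. Returns a new array; the input is not mutated.
function spliceToolMessage(messages, toolMessage) {
  const i = messages.findIndex(
    (m) =>
      m.role === "assistant" &&
      Array.isArray(m.tool_calls) &&
      m.tool_calls.some((tc) => tc.id === toolMessage.tool_call_id)
  );
  const copy = [...messages];
  copy.splice(i === -1 ? copy.length : i + 1, 0, toolMessage);
  return copy;
}
```

The splice position matters because a tool message must directly follow the assistant message whose tool_calls it answers.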

@RobertCraigie (Collaborator)

@cosbgn thanks for the report. I think you might be using the wrong stream class on the frontend. Can you try using ChatCompletionStreamingRunner.fromReadableStream() instead?


cosbgn commented Sep 11, 2024 via email


blechatellier commented Sep 15, 2024

Running into the same issue. I can create a custom ReadableStream and manually enqueue the function result, but I'm not sure about the shape of the chunk so that it gets read correctly on the frontend. @RobertCraigie any pointers please?

@RobertCraigie (Collaborator)

Thanks for reporting, I can reproduce the issue.

@blechatellier unfortunately I'm not currently aware of a full workaround, I think you'd have to manually enqueue chunks in your own format and then call runner._addMessage({ role: 'tool', name, content }); manually to have the functionCallResult event emitted.

We'll look into fixing this.
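To illustrate the receiving side of that workaround (a sketch only: the __tool_message convention is invented here, and _addMessage is a private method, so this may break between library versions): partition the NDJSON payload into ordinary stream chunks and tagged tool messages, feed the chunks to the stream reader, and hand each tool message to runner._addMessage so the functionCallResult event fires.

```javascript
// Partition an NDJSON payload into ordinary stream chunk lines and tagged
// tool messages. Assumes the backend wraps each tool message as
// {"__tool_message": {...}} -- an invented convention, not part of the SDK.
function partitionNdjson(text) {
  const chunks = [];
  const toolMessages = [];
  for (const line of text.split("\n")) {
    if (!line.trim()) continue;
    let parsed = null;
    try {
      parsed = JSON.parse(line);
    } catch {
      // leave malformed lines with the ordinary chunks
    }
    if (parsed && parsed.__tool_message) {
      toolMessages.push(parsed.__tool_message);
    } else {
      chunks.push(line);
    }
  }
  return { chunks, toolMessages };
}

// Frontend usage sketch (toStream is a hypothetical helper that turns the
// chunk lines back into a ReadableStream; this buffers the whole response,
// so it trades away incremental streaming):
//   const { chunks, toolMessages } = partitionNdjson(await response.text());
//   const runner = ChatCompletionStream.fromReadableStream(toStream(chunks));
//   for (const msg of toolMessages) runner._addMessage(msg);
```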


cosbgn commented Sep 19, 2024

@RobertCraigie I'm trying something like this in the backend:

.on("message", async (message) => {
    if (message.role === "tool" && !added_tool_call_ids.includes(message.tool_call_id)) {
        added_tool_call_ids.push(message.tool_call_id)
        runner._addMessage(message)
    }
})

However, it fails, I believe because the runner already has the role=tool message in its history:

400 Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.

Do you have an example on how I could get this to work?
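The 400 above can be reproduced locally with a small checker that mirrors the API rule the error cites (a hypothetical helper for illustration, not the API's actual validation code): every role "tool" message must answer a tool_call id from the immediately preceding assistant message, and each id may be answered only once, so re-adding a tool message that already followed its tool_calls violates the rule.

```javascript
// Return the first invalid role:"tool" message in a chat history, or null if
// the history is valid. A tool message is invalid when its tool_call_id is
// not among the still-unanswered ids of the preceding assistant message --
// which is exactly what a duplicated or orphaned tool message looks like.
function findInvalidToolMessage(messages) {
  let pending = new Set();
  for (const m of messages) {
    if (m.role === "assistant" && Array.isArray(m.tool_calls)) {
      pending = new Set(m.tool_calls.map((tc) => tc.id));
    } else if (m.role === "tool") {
      if (!pending.has(m.tool_call_id)) return m; // duplicate or orphaned
      pending.delete(m.tool_call_id); // each id may be answered once
    } else {
      pending = new Set(); // any other message breaks the tool_calls window
    }
  }
  return null;
}
```

Under this reading, the fix direction would be to make sure the tool message is added exactly once to whichever message list is actually sent back to the API.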
