Log LLM tool call for streamed response #545
I worry about this having a significantly different shape from the other response data dicts.
This is the same shape as the existing non-streamed chat completion, which is why it displays nicely in the UI.
(See logfire/logfire/_internal/integrations/llm_providers/openai.py, lines 81 to 91 at af707e2.)
I'll admit it's not exactly the same: the message object here is a ParsedChatCompletionMessage, a subclass of the non-streamed chat completion message ChatCompletionMessage. It has the "parsed" and tool call "parsed_arguments" fields added. These "ParsedX" classes are the ones returned by the client.beta.chat.completions.parse method.

Aside: it would be handy to have the response dicts that get special handling on the frontend be documented or codified (e.g. using TypedDicts), ideally with the order of precedence. For example, I found that "combined_chunk_content" takes precedence over "message", which is why I excluded it.
Ah, I didn't realise we were already using this shape. Yes, creating some types sounds good. @dmontagu any thoughts? Should we consider these shapes stable?