Add logprobs to ModelSettings #971
base: main
Conversation
…nt-responses Add logprob support for Responses API
@@ -114,6 +114,20 @@ english_agent = Agent(
)
```

Responses API でトークンの対数確率を取得したい場合は、 ("To retrieve token log probabilities with the Responses API, …")
We generate the translated docs with a script, so please revert this change.
@@ -109,6 +109,20 @@ english_agent = Agent(
)
```

You can also request token log probabilities when using the Responses API by …
Once this property is available at the top level, it will be listed on the page linked from here, so could you revert this one too?
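The added docs line above is cut off in this view. Purely as a hedged illustration of the usage it appears to describe, a request for token log probabilities through the field this PR adds might look like the following; the agent name, instructions, prompt, and the value 2 are placeholders, not taken from the PR:

```python
from agents import Agent, ModelSettings, Runner

# Hedged sketch: `top_logprobs` is the field this PR adds to ModelSettings;
# everything else here is a placeholder.
agent = Agent(
    name="Assistant",
    instructions="Reply concisely.",
    model_settings=ModelSettings(top_logprobs=2),  # ask for 2 alternatives per token
)

result = Runner.run_sync(agent, "Say hello.")
print(result.final_output)
```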
@@ -107,6 +108,10 @@ class ModelSettings:
    """Additional output data to include in the model response.
    [include parameter](https://platform.openai.com/docs/api-reference/responses/create#responses-create-include)"""

    top_logprobs: int | None = None
This would be used not only in Responses but also in LiteLLM and Chat Completions, so could you update the following files as well? (See the sketch after this list.)
- src/agents/models/openai_chatcompletions.py
- src/agents/extensions/models/litellm_model.py
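For what the Chat Completions side of that wiring might involve, here is a hedged sketch; the helper name and surrounding structure are assumptions, not code from this PR. Chat Completions (and LiteLLM, which mirrors its parameters) needs `logprobs=True` alongside `top_logprobs`:

```python
from typing import Any


def logprob_kwargs(top_logprobs: int | None) -> dict[str, Any]:
    """Hypothetical helper, not from the PR: map ModelSettings.top_logprobs
    onto Chat Completions request parameters."""
    kwargs: dict[str, Any] = {}
    if top_logprobs is not None:
        # Chat Completions requires both flags: `logprobs=True` enables
        # logprobs, `top_logprobs` (0-20) sets how many alternatives
        # are returned per token.
        kwargs["logprobs"] = True
        kwargs["top_logprobs"] = top_logprobs
    return kwargs
```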
Now that logprobs are available in the Responses API, I thought it might be nice to make them available to agents as well.
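For context, a hedged sketch of the underlying Responses API feature this PR surfaces, using the plain OpenAI Python client rather than the Agents SDK; the model and prompt are placeholders, and parameter availability may vary by SDK version:

```python
from openai import OpenAI

client = OpenAI()

# Logprobs must be both requested (`top_logprobs`) and explicitly added
# to the response payload via `include`.
response = client.responses.create(
    model="gpt-4o",
    input="Say hello.",
    top_logprobs=2,
    include=["message.output_text.logprobs"],
)
```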