fix: convert litellm response with reasoning content to openai message #1098

Open · wants to merge 2 commits into main

Conversation

zkllll2002

1. Description

This PR fixes an issue where reasoning content from models accessed via LiteLLM was not being correctly parsed into the ChatCompletionMessage format. This was particularly noticeable when using reasoning models.

2. Context

I am using the openai-agents-python library in my project, and it has been incredibly helpful. Thank you for building such a great tool!

My setup uses litellm to interface with gemini-2.5-pro. I noticed that while the agent could receive a response, the reasoning (thinking) content from the Gemini model was lost during the conversion from the LiteLLM response format to the OpenAI ChatCompletionMessage object.
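
For reference, a minimal repro sketch (assumes a configured Gemini API key; the `reasoning_effort` argument is how I enabled thinking output in my setup and may differ in yours). litellm surfaces the thinking text on the message as `reasoning_content`, which is exactly the field that was being dropped:

```python
# Minimal repro sketch, not part of the PR diff.
import litellm

response = litellm.completion(
    model="gemini/gemini-2.5-pro",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    reasoning_effort="low",  # ask the model to emit thinking tokens
)

message = response.choices[0].message
print(message.content)            # final answer: survives the conversion
print(message.reasoning_content)  # thinking text: was lost in conversion
```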

I saw that PR #871 made progress on a similar issue, but the specific response structure from LiteLLM still requires a small adaptation. This fix adds the logic needed to handle those responses correctly.

Relates to: #871

3. Key Changes

  • LitellmConverter.convert_message_to_openai: also copy reasoning_content onto the converted message (see the sketch after this list)
  • Converter.items_to_messages: pass over reasoning items instead of raising on them
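
Below is a hedged sketch of both changes, illustrative only and not the exact diff: `ReasoningMessage` is a hypothetical subclass I use here to show the idea, and the item handling is abbreviated.

```python
from typing import Any, Optional

from openai.types.chat import ChatCompletionMessage


class ReasoningMessage(ChatCompletionMessage):
    """ChatCompletionMessage plus the thinking text litellm returns."""

    reasoning_content: Optional[str] = None


def convert_message_to_openai(message: Any) -> ChatCompletionMessage:
    # ... existing role / content / tool-call handling elided ...
    return ReasoningMessage(
        role="assistant",
        content=message.content,
        # New: carry the thinking text over instead of dropping it.
        reasoning_content=getattr(message, "reasoning_content", None),
    )


def items_to_messages(items: list) -> list:
    messages: list = []
    for item in items:
        # New: reasoning items no longer trip the unknown-item error;
        # they are simply passed over when building chat messages.
        if isinstance(item, dict) and item.get("type") == "reasoning":
            continue
        # ... existing handling for messages, tool calls, etc. ...
    return messages
```

With this in place, agents running reasoning models through litellm keep the thinking text alongside the final answer.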
