litellm proxy with openai-agents #696


Open
sumit-lightringai opened this issue May 15, 2025 · 3 comments
Labels: question (Question about using the SDK), stale

Comments

@sumit-lightringai

Please read this first

version=0.0.14

Error

ERROR: Cannot install litellm>=1.67.4.post1, litellm[proxy]==1.67.4.post1, litellm[proxy]==1.67.5, litellm[proxy]==1.67.6, litellm[proxy]==1.68.0, litellm[proxy]==1.68.1, litellm[proxy]==1.68.2, litellm[proxy]==1.69.0, litellm[proxy]==1.69.1, litellm[proxy]==1.69.2 and openai-agents because these package versions have conflicting dependencies.

The conflict is caused by:
    openai-agents 0.0.14 depends on mcp<2 and >=1.6.0; python_version >= "3.10"
    litellm[proxy] 1.69.2 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.69.1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.69.0 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.68.2 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.68.1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.68.0 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    litellm[proxy] 1.67.6 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
    The user requested litellm>=1.67.4.post1
    litellm[proxy] 1.67.5 depends on litellm 1.67.5 (from https://files.pythonhosted.org/packages/99/dc/e4db6b72347446893cab8599b0a043b8883e3380ce5d86a17d4e71aaffbd/litellm-1.67.5-py3-none-any.whl (from https://pypi.org/simple/litellm/) (requires-python:!=2.7.*,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,!=3.7.*,>=3.8))
    litellm[proxy] 1.67.4.post1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
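The log above shows why pip's resolver cannot succeed: openai-agents 0.0.14 requires mcp >=1.6.0,<2, while every listed litellm[proxy] release pins mcp ==1.5.0, so no mcp version satisfies both. A minimal sketch of the check, using the `packaging` library (the same specifier machinery pip builds on); the candidate version strings are illustrative:

```python
# Demonstrates the disjoint mcp requirements from the pip error above.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

agents_req = SpecifierSet(">=1.6.0,<2")  # openai-agents 0.0.14 -> mcp
proxy_req = SpecifierSet("==1.5.0")      # litellm[proxy] -> mcp

# Illustrative candidate mcp versions; none can satisfy both specifier sets.
candidates = [Version(v) for v in ["1.5.0", "1.6.0", "1.8.0"]]
compatible = [v for v in candidates if v in agents_req and v in proxy_req]
print(compatible)  # [] -- the intersection is empty, so resolution fails
```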

Current versions

openai-agents[litellm]>=0.0.14
litellm[proxy]==1.67.4.post1

Are there any fixes?

@sumit-lightringai added the question (Question about using the SDK) label on May 15, 2025
@sumit-lightringai (Author)

openai-agents[litellm]>=0.0.7 works with litellm[proxy] (or vice versa), but that version is not production quality.

@rm-openai (Collaborator)

@sumit-lightringai My first instinct is to see if we can update the mcp version in the litellm proxy. Thoughts? mcp 1.8.0 adds the new MCP streamable HTTP server.


This issue is stale because it has been open for 7 days with no activity.

@github-actions bot added the stale label on May 23, 2025