Insights: openai/openai-agents-python
Overview
- 0 Merged pull requests
- 5 Open pull requests
- 5 Closed issues
- 10 New issues
There hasn’t been any commit activity on openai/openai-agents-python in the last 3 days.
5 Pull requests opened by 5 people
- Docs: Fix Guardrail definition to include both input and output validations (#756, opened May 25, 2025)
- Fix typo: Replace 'two' with 'three' in /docs/mcp.md (#757, opened May 25, 2025)
- Changed a mistake (#759, opened May 26, 2025)
- Fix and Document `parallel_tool_calls` Attribute in ModelSettings (#763, opened May 26, 2025)
- Added support for passing tool_call_id via the RunContextWrapper (#766, opened May 27, 2025)
5 Issues closed by 3 people
- Ba (#755, closed May 27, 2025)
- MCP Stdio - Hosted tools are not supported with the ChatCompletions API (#761, closed May 26, 2025)
- How to pass hardcoded dynamic messages as agent's responses in the chat history? (#695, closed May 26, 2025)
- litellm proxy with openai-agents (#696, closed May 26, 2025)
- How to pass reasoning=generate_summary for ComputerTool Agent? (#698, closed May 26, 2025)
10 Issues opened by 10 people
- Bug: Pydantic Warnings and `.final_output` Serialization Issue with Code Interpreter File Citations (#772, opened May 28, 2025)
- Only 1 handoff getting called no matter what (#771, opened May 28, 2025)
- function_tool calling - agents orchestration (#769, opened May 27, 2025)
- Agents Reproducibility - seed? top_p=0? (#768, opened May 27, 2025)
- How to dynamically add/remove tools in a `tool_use_behavior="run_llm_again"` loop (#767, opened May 27, 2025)
- Tool calling with LiteLLM and thinking models fails (#765, opened May 27, 2025)
- How to implement handoff between parent and child agents in different paths (#764, opened May 27, 2025)
- parallel_tool_calls default is True in the client, despite docs saying False (#762, opened May 26, 2025)
- `ValidationError` from `InputTokensDetails` when using `LitellmModel` with `None` `cached_tokens` (#760, opened May 26, 2025)
- LiteLLM + Gemini 2.5 Pro: cached_tokens=None crashes Agents SDK with Pydantic int-validation error (#758, opened May 26, 2025)
16 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- Added RunErrorDetails object for MaxTurnsExceeded exception (#743, commented on May 27, 2025 • 1 new comment)
- How to pass the control back to triage agent (#714, commented on May 26, 2025 • 0 new comments)
- Cannot get the last tool_call_output event in stream_events when MaxTurnsExceeded (#526, commented on May 26, 2025 • 0 new comments)
- Enhance `on_tool_start` Hook to Include Tool Call Arguments (#252, commented on May 26, 2025 • 0 new comments)
- Add Session Memory (#745, commented on May 26, 2025 • 0 new comments)
- Issue with Multi-turn Handoff and Slot Filling (#741, commented on May 27, 2025 • 0 new comments)
- How to provide resources with an MCP server? (#676, commented on May 27, 2025 • 0 new comments)
- Human-In-The-Loop Architecture should be implemented on top priority! (#636, commented on May 27, 2025 • 0 new comments)
- How to use openai-agents to call text2image API? (#725, commented on May 28, 2025 • 0 new comments)
- Fix typos in documentation and event naming across multiple files (#651, commented on May 26, 2025 • 0 new comments)
- Fixed Python syntax (#665, commented on May 27, 2025 • 0 new comments)
- docs: Fix Guardrail intro to include output validation (#697, commented on May 26, 2025 • 0 new comments)
- Fix incorrect and ambiguous guardrail documentation: Input/Output Guardrails and agent sequencing (#699, commented on May 26, 2025 • 0 new comments)
- docs: Fix context example syntax and improve clarity (#700, commented on May 26, 2025 • 0 new comments)
- docs: Enhance fetch_user_age tool: include UID and correct output format (#702, commented on May 26, 2025 • 0 new comments)
- Create The moon dev (#709, commented on May 28, 2025 • 0 new comments)