Insights: openai/openai-agents-python
Overview
1 Release published by 1 person
- v0.0.16, published May 21, 2025
11 Pull requests merged by 5 people
- Fix Gemini API content filter handling (#746, merged May 23, 2025)
- Update MCP and tool docs (#736, merged May 23, 2025)
- Fix visualization recursion with cycle detection (#737, merged May 23, 2025)
- Fix Gemini token validation issue with LiteLLM (#735, merged May 21, 2025; see the LiteLLM sketch after this list)
- v0.0.16 (#733, merged May 21, 2025)
- Add support for local shell, image generator, code interpreter tools (#732, merged May 21, 2025)
- Hosted MCP support (#731, merged May 21, 2025)
- Upgrade OpenAI SDK version (#730, merged May 21, 2025)
- Add ID generation methods to TraceProvider (#729, merged May 21, 2025)
- Add usage details to Usage class (#726, merged May 20, 2025)
- Add Galileo to external tracing processors list (#662, merged May 19, 2025)
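Several of the merged fixes above (#735, #746) concern running agents on Gemini through LiteLLM. For context, here is a minimal sketch of how the SDK's LitellmModel extension is typically wired into an Agent; the model string and API key are placeholder assumptions, not values taken from those changes.

```python
from agents import Agent, Runner
from agents.extensions.models.litellm_model import LitellmModel

# Route the agent through LiteLLM instead of the default OpenAI client.
# Model name and API key are placeholders; substitute your own provider values.
agent = Agent(
    name="Assistant",
    instructions="Answer concisely.",
    model=LitellmModel(model="gemini/gemini-2.0-flash", api_key="YOUR_GEMINI_API_KEY"),
)

result = Runner.run_sync(agent, "Say hello in one sentence.")
print(result.final_output)
```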
11 Pull requests opened by 11 people
- Make Runner an abstract base class (#720, opened May 20, 2025)
- docs: fix output guardrail step description (#724, opened May 20, 2025)
- Add get_tool_call_name method to extract tool name from ToolCallItem (#728, opened May 21, 2025)
- Jules wip 10459775938551966727 (#739, opened May 22, 2025)
- fix(generation_span): Include `input` when `tracing.include_data()` (#742, opened May 22, 2025)
- Added RunErrorDetails object for MaxTurnsExceeded exception (#743, opened May 22, 2025)
- Attach partial run data on errors (#747, opened May 23, 2025)
- Replace duplicated "can can" with a single "can" (#751, opened May 24, 2025)
- Add Sessions for Automatic Conversation History Management (#752, opened May 24, 2025)
- Docs: Fix Guardrail definition to include both input and output validations (#756, opened May 25, 2025)
- Fix typo: Replace 'two' with 'three' in /docs/mcp.md (#757, opened May 25, 2025)
15 Issues closed by 5 people
- How to pass hardcoded dynamic messages as agent's responses in the chat history? (#695, closed May 26, 2025)
- litellm proxy with openai-agents (#696, closed May 26, 2025)
- How to pass reasoning=generate_summary for ComputerTool Agent? (#698, closed May 26, 2025)
- Multiple handoffs requested error on the tracing platform (#694, closed May 25, 2025)
- Troubleshooting Agent Handoff in Multi-Agent Workflow (#681, closed May 24, 2025)
- Infinite recursion in src/agents/extensions/visualization.py due to circular references (#668, closed May 23, 2025)
- Streaming chain of thoughts (#721, closed May 22, 2025)
- When using Japanese in AzureOpenAI, answers may not be displayed (#649, closed May 22, 2025)
- Threads API (#674, closed May 22, 2025)
- LiteLLM with Gemini token issue (#734, closed May 21, 2025)
- Passing images as input to an agent (#727, closed May 21, 2025)
- llm.txt? (#670, closed May 20, 2025)
- How to make LiteLLM models run in reasoning mode (#671, closed May 20, 2025)
- Add an example for telephony voice agent (#672, closed May 20, 2025)
- MCP Server instructions are currently ignored (#704, closed May 19, 2025)
18 Issues opened by 17 people
- LiteLLM + Gemini 2.5 Pro: cached_tokens=None crashes Agents SDK with Pydantic int-validation error (#758, opened May 26, 2025)
- Ba (#755, opened May 25, 2025)
- Streaming output includes unnecessary text (#753, opened May 24, 2025)
- Support for Bedrock prompt caching (#750, opened May 24, 2025)
- Will session memory be implemented in the future? (#748, opened May 23, 2025)
- Add Session Memory (#745, opened May 22, 2025)
- Issue with Multi-turn Handoff and Slot Filling (#741, opened May 22, 2025)
- Can't implement guardrails with litellm (#740, opened May 22, 2025)
- How to integrate audio input in the Agents SDK? (not realtime or voice pipeline, e.g., gpt-4o audio) (#738, opened May 22, 2025)
- How to use openai-agents to call a text2image API? (#725, opened May 20, 2025)
- Unable to use tool calling when using Llama 4 Scout via LiteLLM Proxy (#723, opened May 20, 2025)
- How would I handoff a non-reasoning model with tool calls to a reasoning model? (#722, opened May 20, 2025)
- Make intermediate results available when `MaxTurnExceededException` is thrown (#719, opened May 20, 2025)
- Very high response times at random during hand-offs (#717, opened May 19, 2025)
- Allow agent to return logprobs (#715, opened May 19, 2025)
- How to pass control back to the triage agent (#714, opened May 19, 2025)
28 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- How to provide resources with an MCP server? (#676, commented on May 19, 2025; 0 new comments)
- Why does the Computer protocol not have the goto method? (#547, commented on May 19, 2025; 0 new comments)
- Make FuncTool and @function_tool decorated function callable (#708, commented on May 20, 2025; 0 new comments; see the tool sketch at the end of this section)
- Proper way of managing large context window for ComputerTool (#111, commented on May 20, 2025; 0 new comments)
- Support for OpenAI Agents SDK with JavaScript/TypeScript (#240, commented on May 21, 2025; 0 new comments)
- Input format in agent as tool (#655, commented on May 22, 2025; 0 new comments)
- How to make the conversation finally return to the main agent (#527, commented on May 22, 2025; 0 new comments)
- from agents.extensions.models.litellm_model import LitellmModel (#666, commented on May 22, 2025; 0 new comments)
- Support for MCP prompts and resources (#544, commented on May 22, 2025; 0 new comments)
- PyCharm fails to recognize Agent constructor arguments (v0.0.15) (#712, commented on May 24, 2025; 0 new comments)
- Add support for image return in Agent tools (#341, commented on May 24, 2025; 0 new comments)
- Tracing client error 400 (MCP) (#710, commented on May 25, 2025; 0 new comments)
- Human-in-the-loop architecture should be implemented as a top priority (#636, commented on May 25, 2025; 0 new comments)
- Add tool call parameters for `on_tool_start` hook (#253, commented on May 23, 2025; 0 new comments)
- Add a new GH Actions job to automatically update translated document pages (#598, commented on May 21, 2025; 0 new comments)
- [MCP][Utils] Add support for FastMCP processing (#631, commented on May 20, 2025; 0 new comments)
- fix: add ensure_ascii=False to json.dumps for correct Unicode output (#639, commented on May 25, 2025; 0 new comments)
- feat: Add support for image function tools (#654, commented on May 19, 2025; 0 new comments)
- Fixed Python syntax (#665, commented on May 25, 2025; 0 new comments)
- feat: storage adapter (bubble / supabase) (#669, commented on May 20, 2025; 0 new comments)
- Use `max_completion_tokens` param for OpenAI Chat Completion API (#679, commented on May 22, 2025; 0 new comments)
- Added response cost in the Usage (#682, commented on May 23, 2025; 0 new comments)
- Feature/message_filter (#687, commented on May 23, 2025; 0 new comments)
- docs: add DeepWiki badge for AI-powered project documentation (#689, commented on May 24, 2025; 0 new comments)
- docs: Fix Guardrail intro to include output validation (#697, commented on May 26, 2025; 0 new comments)
- Fix incorrect and ambiguous guardrail documentation: Input/Output Guardrails and agent sequencing (#699, commented on May 26, 2025; 0 new comments)
- docs: Fix context example syntax and improve clarity (#700, commented on May 26, 2025; 0 new comments)
- docs: Enhance fetch_user_age tool: include UID and correct output format (#702, commented on May 26, 2025; 0 new comments)
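Several of the open threads above (#708, #253, #341, #654) revolve around function tools and the hooks around them. As background, here is a minimal sketch of how a function tool is typically declared and attached to an agent via the SDK's @function_tool decorator; the tool name, body, and prompt are illustrative assumptions, not code from those threads.

```python
from agents import Agent, Runner, function_tool

# A plain Python function becomes a callable tool; its docstring and type
# hints are used to build the tool schema. The weather lookup below is a
# placeholder body, not a real integration.
@function_tool
def get_weather(city: str) -> str:
    """Return a short weather summary for the given city."""
    return f"The weather in {city} is sunny."

agent = Agent(
    name="Assistant",
    instructions="Use the weather tool when asked about the weather.",
    tools=[get_weather],
)

result = Runner.run_sync(agent, "What's the weather in Tokyo?")
print(result.final_output)
```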