Parallel Tool Calling: Q/Bug? #791
Comments
Do you have an example? Just to clarify, parallel_tool_calling means the model can produce N>1 tool calls in a single response.
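To make that concrete, here is a minimal standalone sketch of what "N>1 tool calls in a single response" means. The payload below is mocked for illustration (the field names loosely follow the Responses API's function_call items, but this is not real API output):

```python
import json

# Mocked shape of a single model turn that contains two parallel
# tool calls (illustrative payload, not real Responses API output).
model_output = [
    {"type": "function_call", "name": "get_current_time", "arguments": "{}", "call_id": "call_1"},
    {"type": "function_call", "name": "get_current_weather", "arguments": "{}", "call_id": "call_2"},
]

# Parallel tool calling = more than one function_call item in one response.
tool_calls = [item for item in model_output if item["type"] == "function_call"]
print(f"{len(tool_calls)} tool calls in a single response")
for call in tool_calls:
    print(call["name"], json.loads(call["arguments"]))
```

The runner then executes all of these calls before sending the results back in the next request.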
I think what we were looking for was: a first Responses API call yielding multiple tool calls, those tools being called, then another Responses API call for result generation. Right now we see: Responses ...
Yeah, what you described is indeed what parallel tool calling should do. Do you have any sample code you can share?
Have you tried changing the provider? Do check with OpenAI models.
We are using gpt-4.1 |
Happy to help once you are able to share more details on the prompt, tools list, input, etc.
from datetime import datetime
from agents import Agent, Runner, function_tool, ModelSettings, trace, RunContextWrapper, WebSearchTool

@function_tool(description_override="Get current date and time.")
async def get_current_time(ctx: RunContextWrapper) -> str:
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")

@function_tool(description_override="Get the current weather")
async def get_current_weather(ctx: RunContextWrapper) -> str:
    return "It's sunny and warm."

agent = Agent(
    name="Assistant",
    model="gpt-4.1",
    instructions="You are a helpful assistant",
    tools=[get_current_time, get_current_weather, WebSearchTool(search_context_size="medium")],
    model_settings=ModelSettings(
        parallel_tool_calls=True, temperature=0.0, tool_choice="auto"
    ),
)

with trace(workflow_name="openai-agent-example"):
    result = Runner.run_sync(agent, "Give me the current time and current weather")
    print(result.final_output)

I tested with and without WebSearchTool. With it, parallel execution does not happen; without it, it does.
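For what the tool-execution side of this looks like, here is a standalone sketch (not the SDK's actual runner code) of awaiting both sample tools concurrently once the model has emitted two tool calls in one turn; the plain async functions below stand in for the decorated tools above:

```python
import asyncio
from datetime import datetime

# Plain coroutines standing in for the function tools above
# (no agents SDK needed for this sketch).
async def get_current_time() -> str:
    await asyncio.sleep(0.1)  # simulate I/O latency
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")

async def get_current_weather() -> str:
    await asyncio.sleep(0.1)  # simulate I/O latency
    return "It's sunny and warm."

async def main() -> list[str]:
    # asyncio.gather awaits both tools concurrently, so total wall
    # time is roughly one sleep (~0.1s) rather than two sequential ones.
    return await asyncio.gather(get_current_time(), get_current_weather())

results = asyncio.run(main())
print(results)
```

Sequential execution would instead await each tool one after the other, which is the behavior observed when a hosted tool like WebSearchTool is in the tools list.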
Ah yes. Parallel tool calling is disabled when you enable hosted tools. We're hoping to enable it in the future! |
I see! For the code interpreter it seems to work fine, though.
I think that might be an oversight on our part |
You can leave it like this 😂 |
Question: I can't seem to figure out how to get agents to call tools in parallel instead of sequentially. Tried setting parallel_tool_calls=True in model_settings, but no luck (the functions are in strict mode). What am I missing? I'm not finding any clear documentation for this.

Debug:
openai-agents==0.0.16
Python==3.12.2