Parallel Tool Calling: Q/Bug? #791


Open
conpaine109 opened this issue May 30, 2025 · 11 comments
Labels: needs-more-info (Waiting for a reply/more info from the author), question (Question about using the SDK)

Comments

@conpaine109

Question: I can't seem to figure out how to get agents to call tools in parallel instead of sequentially. I tried setting the parallel_tool_calls parameter to True in model_settings, but no luck (functions are in strict mode). What am I missing? I can't find any clear documentation for this.

Debug:
openai-agents==0.0.16
Python v. == 3.12.2

conpaine109 added the question label May 30, 2025
@rm-openai
Collaborator

Do you have an example? Just to clarify: parallel_tool_calls means the model can produce N>1 tool calls in a single response.

@grillorafael

I think what we were looking for was: one Responses API call yielding multiple tool calls, those tools all being executed, then another Responses API call for result generation.

Right now we see:

Responses
Tool
Responses
Tool
Responses
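The flow described above (one Responses call yielding several tool calls that then run concurrently, followed by a final Responses call) can be sketched with plain asyncio. The stub tools and timings below are hypothetical stand-ins, not the SDK's internals:

```python
import asyncio

# Hypothetical stubs standing in for real function tools.
async def get_current_time() -> str:
    await asyncio.sleep(0.1)  # simulate I/O latency
    return "2025-05-30 12:00:00"

async def get_current_weather() -> str:
    await asyncio.sleep(0.1)
    return "It's sunny and warm."

async def run_turn() -> list[str]:
    # Step 1: a single Responses API call yields N>1 tool calls
    # (simulated here as a static list of coroutines).
    tool_calls = [get_current_time(), get_current_weather()]
    # Step 2: execute the tool calls concurrently, not one by one.
    results = await asyncio.gather(*tool_calls)
    # Step 3: a second Responses API call would turn these results
    # into the final answer; here we just return them.
    return list(results)

print(asyncio.run(run_turn()))
```

With gather, both stub tools complete in roughly one sleep interval instead of two, which is the behavior the sequential trace above is missing.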

@rm-openai
Collaborator

Yeah what you described is indeed what parallel tool calling should do. Do you have any sample code you can share?

@Rehan-Ul-Haq
Contributor

> Question: I can't seem to figure out how to get agents to call tools in parallel instead of sequentially. I tried setting the parallel_tool_calls parameter to True in model_settings, but no luck (functions are in strict mode). What am I missing? I can't find any clear documentation for this.
>
> Debug: openai-agents==0.0.16, Python 3.12.2

Have you tried changing the provider? Do check with OpenAI models.
Also read #763

@grillorafael

> Have you tried changing the provider? Do check with OpenAI models. Also read #763

We are using gpt-4.1

rm-openai added the needs-more-info label Jun 2, 2025
@rm-openai
Collaborator

Happy to help once you are able to share more details: the prompt, tools list, input, etc.

@grillorafael

from datetime import datetime
from agents import Agent, Runner, function_tool, ModelSettings, trace, RunContextWrapper, WebSearchTool


@function_tool(description_override="Get current date and time.")
async def get_current_time(ctx: RunContextWrapper) -> str:
    return datetime.now().strftime("%Y-%m-%d %H:%M:%S")


@function_tool(description_override="Get the current weather")
async def get_current_weather(ctx: RunContextWrapper) -> str:
    return "It's sunny and warm."


agent = Agent(
    name="Assistant",
    model="gpt-4.1",
    instructions="You are a helpful assistant",
    tools=[get_current_time, get_current_weather, WebSearchTool(search_context_size="medium")],
    model_settings=ModelSettings(
        parallel_tool_calls=True, temperature=0.0, tool_choice="auto"
    ),
)


with trace(
    workflow_name="openai-agent-example",
):
    result = Runner.run_sync(agent, "Give me the current time and current weather")
    print(result.final_output)

I tested with and without WebSearchTool. With it, the tools are not executed in parallel; without it, they are.

[Trace screenshots: with WebSearchTool (sequential tool calls) and without it (parallel tool calls)]

@rm-openai
Collaborator

Ah yes. Parallel tool calling is disabled when you enable hosted tools. We're hoping to enable it in the future!
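Until that lands, one possible workaround is to inspect the tools list and only request parallel tool calls when no hosted tool is present. A minimal sketch with hypothetical tool descriptors (plain dicts, not the SDK's real tool types):

```python
# Hypothetical tool descriptors; "hosted" marks server-side tools such as
# web search, which (per this thread) disable parallel tool calling.
TOOLS = [
    {"name": "get_current_time", "hosted": False},
    {"name": "get_current_weather", "hosted": False},
    {"name": "web_search", "hosted": True},
]

def build_model_settings(tools: list[dict]) -> dict:
    # Only request parallel tool calls when every tool is a function tool.
    has_hosted = any(t["hosted"] for t in tools)
    return {"parallel_tool_calls": not has_hosted, "tool_choice": "auto"}

print(build_model_settings(TOOLS))      # hosted tool present
print(build_model_settings(TOOLS[:2]))  # function tools only
```

The same check could gate whether WebSearchTool is included in an agent's tools list at all, depending on which behavior matters more for a given run.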

@grillorafael

I see! For the code interpreter it seems to work fine, though.

@rm-openai
Collaborator

I think that might be an oversight on our part.

@grillorafael

You can leave it like this 😂
