26/07/2025, 08:33 Function calling - OpenAI API
Chat Completions
Function calling
Enable models to fetch data and take actions.
Function calling provides a powerful and flexible way for OpenAI models to interface with your
code or external services. This guide will explain how to connect the models to your own
custom code to fetch data or take action.
Get weather Send email Search knowledge base
Function calling example with get_weather function python
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                }
            },
            "required": ["location"],
            "additionalProperties": False
        },
        "strict": True
    }
}]

completion = client.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "What is the weather like in Paris today?"}],
    tools=tools
)

print(completion.choices[0].message.tool_calls)
Output
[{
    "id": "call_12345xyz",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": "{\"location\":\"Paris, France\"}"
    }
}]
Experiment with function calling and generate function schemas in the Playground!
Overview
You can give the model access to your own custom code through function calling. Based on
the system prompt and messages, the model may decide to call these functions — instead of
(or in addition to) generating text or audio.
You'll then execute the function code, send back the results, and the model will incorporate
them into its final response.
https://platform.openai.com/docs/guides/function-calling?api-mode=chat&example=get-weather
Function calling has two primary use cases:
Fetching data: Retrieve up-to-date information to incorporate into the model's response (RAG). Useful for searching knowledge bases and retrieving specific data from APIs (e.g. current weather data).
Taking action: Perform actions like submitting a form, calling APIs, modifying application state (UI/frontend or backend), or taking agentic workflow actions (like handing off the conversation).
Sample function
Let's look at the steps to allow a model to use a real get_weather function defined below:
Sample get_weather function implemented in your codebase python
import requests

def get_weather(latitude, longitude):
    response = requests.get(f"https://api.open-meteo.com/v1/forecast?latitude={latitude}&longitude={longitude}&current=temperature_2m")
    data = response.json()
    return data['current']['temperature_2m']
Unlike the diagram earlier, this function expects precise latitude and longitude instead
of a general location parameter. (However, our models can automatically determine the
coordinates for many locations!)
Function calling steps
1 Call model with functions defined – along with your system and user messages.
Step 1: Call model with get_weather tool defined python
from openai import OpenAI
import json

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for provided coordinates in celsius.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "number"},
                "longitude": {"type": "number"}
            },
            "required": ["latitude", "longitude"],
            "additionalProperties": False
        },
        "strict": True
    }
}]

messages = [{"role": "user", "content": "What's the weather like in Paris today?"}]

completion = client.chat.completions.create(
    model="gpt-4.1",
    messages=messages,
    tools=tools,
)
2 Model decides to call function(s) – model returns the name and input arguments.
completion.choices[0].message.tool_calls
[{
    "id": "call_12345xyz",
    "type": "function",
    "function": {
        "name": "get_weather",
        "arguments": "{\"latitude\":48.8566,\"longitude\":2.3522}"
    }
}]
3 Execute function code – parse the model's response and handle function calls.
Step 3: Execute get_weather function python
tool_call = completion.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

result = get_weather(args["latitude"], args["longitude"])
4 Supply model with results – so it can incorporate them into its final response.
Step 4: Supply result and call model again python
messages.append(completion.choices[0].message)  # append model's function call message
messages.append({                               # append result message
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": str(result)
})

completion_2 = client.chat.completions.create(
    model="gpt-4.1",
    messages=messages,
    tools=tools,
)
5 Model responds – incorporating the result in its output.
completion_2.choices[0].message.content
"The current temperature in Paris is 14°C (57.2°F)."
Defining functions
Functions can be set in the tools parameter of each API request inside a function object.
A function is defined by its schema, which informs the model what it does and what input
arguments it expects. It comprises the following fields:
FIELD        DESCRIPTION
name         The function's name (e.g. get_weather)
description  Details on when and how to use the function
parameters   JSON schema defining the function's input arguments
Take a look at this example or generate your own below (or in our Playground).
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Retrieves current weather for the given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                },
                "units": {
                    "type": "string",
                    "enum": ["celsius", "fahrenheit"],
                    "description": "Units the temperature will be returned in."
                }
            },
            "required": ["location", "units"],
            "additionalProperties": false
        },
        "strict": true
    }
}
Because the parameters are defined by a JSON schema, you can leverage many of its rich features like property types, enums, descriptions, nested objects, and recursive objects.
(Optional) Function calling with Pydantic and Zod
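To illustrate what such helpers automate, here is a minimal, hypothetical sketch that derives a function schema from a typed Python dataclass using only the standard library. to_tool_schema and GetWeather are inventions for this example, not SDK APIs; Pydantic's and Zod's schema generators handle far more types and validation.

```python
import json
from dataclasses import dataclass, fields

# Minimal type mapping; real libraries cover many more cases.
_JSON_TYPES = {str: "string", float: "number", int: "integer", bool: "boolean"}

@dataclass
class GetWeather:
    """Get current temperature for a given location."""
    location: str  # City and country e.g. Bogotá, Colombia

def to_tool_schema(cls, name):
    # Build a strict-mode function schema from the dataclass's typed fields.
    properties = {f.name: {"type": _JSON_TYPES[f.type]} for f in fields(cls)}
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": cls.__doc__,
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": list(properties),
                "additionalProperties": False,
            },
            "strict": True,
        },
    }

tool = to_tool_schema(GetWeather, "get_weather")
print(json.dumps(tool, indent=2))
```

The resulting dict can be placed directly in the tools list of a request.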
Best practices for defining functions
1 Write clear and detailed function names, parameter descriptions, and instructions.
Explicitly describe the purpose of the function and each parameter (and its format),
and what the output represents.
Use the system prompt to describe when (and when not) to use each function.
Generally, tell the model exactly what to do.
Include examples and edge cases, especially to rectify any recurring failures. (Note:
Adding examples may hurt performance for reasoning models.)
2 Apply software engineering best practices.
Make the functions obvious and intuitive. (principle of least surprise)
Use enums and object structure to make invalid states unrepresentable. (e.g.
toggle_light(on: bool, off: bool) allows for invalid calls)
Pass the intern test. Can an intern/human correctly use the function given nothing
but what you gave the model? (If not, what questions do they ask you? Add the
answers to the prompt.)
3 Offload the burden from the model and use code where possible.
Don't make the model fill arguments you already know. For example, if you already have an order_id based on a previous menu, don't have an order_id param; instead, define submit_refund() with no parameters and pass the order_id in code.
Combine functions that are always called in sequence. For example, if you always
call mark_location() after query_location() , just move the marking logic into
the query function call.
4 Keep the number of functions small for higher accuracy.
Evaluate your performance with different numbers of functions.
Aim for fewer than 20 functions at any one time, though this is just a soft
suggestion.
5 Leverage OpenAI resources.
Generate and iterate on function schemas in the Playground.
Consider fine-tuning to increase function calling accuracy for large numbers of
functions or difficult tasks. (cookbook)
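To make the toggle_light point above concrete, here is a sketch contrasting the two designs; both schemas are hypothetical. The boolean pair can represent a contradictory call (on and off both true), while a single enum cannot express an invalid state.

```python
# Hypothetical: two booleans allow an invalid combination.
bad_schema = {
    "name": "toggle_light",
    "parameters": {
        "type": "object",
        "properties": {
            "on": {"type": "boolean"},
            "off": {"type": "boolean"},  # on=True, off=True is representable
        },
        "required": ["on", "off"],
        "additionalProperties": False,
    },
}

# Hypothetical: a single enum makes invalid states unrepresentable.
good_schema = {
    "name": "set_light_state",
    "parameters": {
        "type": "object",
        "properties": {
            "state": {"type": "string", "enum": ["on", "off"]},
        },
        "required": ["state"],
        "additionalProperties": False,
    },
}
```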
Token usage
Under the hood, functions are injected into the system message in a syntax the model has
been trained on. This means functions count against the model's context limit and are billed as
input tokens. If you run into token limits, we suggest limiting the number of functions or the
length of the descriptions you provide for function parameters.
It is also possible to use fine-tuning to reduce the number of tokens used if you have many
functions defined in your tools specification.
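If you want a rough sense of how much schema text you are sending, you can measure the serialized tools list. This is only a crude character-count proxy (roughly 4 characters per token for English text), not an official token count or billing calculation:

```python
import json

def schema_chars(tools):
    # Crude proxy: the length of the serialized schema correlates with the
    # input tokens the tool definitions consume. This is an approximation
    # only; use the API's usage field for actual token counts.
    return len(json.dumps(tools))

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
            "additionalProperties": False,
        },
    },
}]

print(schema_chars(tools), "chars, roughly", schema_chars(tools) // 4, "tokens")
```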
Handling function calls
When the model calls a function, you must execute it and return the result. Since model
responses can include zero, one, or multiple calls, it is best practice to assume there are
several.
The response has an array of tool_calls , each with an id (used later to submit the
function result) and a function containing a name and JSON-encoded arguments .
Sample response with multiple function calls
[
    {
        "id": "call_12345xyz",
        "type": "function",
        "function": {
            "name": "get_weather",
            "arguments": "{\"location\":\"Paris, France\"}"
        }
    },
    {
        "id": "call_67890abc",
        "type": "function",
        "function": {
            "name": "get_weather",
            "arguments": "{\"location\":\"Bogotá, Colombia\"}"
        }
    },
    {
        "id": "call_99999def",
        "type": "function",
        "function": {
            "name": "send_email",
            "arguments": "{\"to\":\"bob@email.com\",\"body\":\"Hi bob\"}"
        }
    }
]
Execute function calls and append results python
for tool_call in completion.choices[0].message.tool_calls:
    name = tool_call.function.name
    args = json.loads(tool_call.function.arguments)

    result = call_function(name, args)
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": str(result)
    })
In the example above, we have a hypothetical call_function to route each call. Here’s a
possible implementation:
Execute function calls and append results python
def call_function(name, args):
    if name == "get_weather":
        return get_weather(**args)
    if name == "send_email":
        return send_email(**args)
Formatting results
A result must be a string, but the format is up to you (JSON, error codes, plain text, etc.). The
model will interpret that string as needed.
If your function has no return value (e.g. send_email), simply return a string to indicate success or failure (e.g. "success").
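One possible helper for normalizing results into strings, serializing structured data as JSON and returning a plain success marker for functions with no return value; format_tool_result is an invention for this sketch, not an SDK function:

```python
import json

def format_tool_result(result):
    # Tool message content must be a string, so serialize structured
    # results as JSON and pass other values through str().
    if isinstance(result, (dict, list)):
        return json.dumps(result)
    if result is None:
        return "success"  # e.g. for functions like send_email with no return value
    return str(result)

print(format_tool_result({"temp_c": 14.0}))
print(format_tool_result(None))
print(format_tool_result(14.0))
```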
Incorporating results into response
After appending the results to your messages , you can send them back to the model to get a
final response.
Send results back to model python
completion = client.chat.completions.create(
    model="gpt-4.1",
    messages=messages,
    tools=tools,
)
Final response
"It's about 15°C in Paris, 18°C in Bogotá, and I've sent that email to Bob."
Additional configurations
Tool choice
By default, the model will determine when and how many tools to use. You can force specific behavior with the tool_choice parameter.
1 Auto: (Default) Call zero, one, or multiple functions. tool_choice: "auto"
2 Required: Call one or more functions. tool_choice: "required"
3 Forced Function: Call exactly one specific function.
tool_choice: {"type": "function", "function": {"name": "get_weather"}}
You can also set tool_choice to "none" to imitate the behavior of passing no functions.
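As a sketch, these settings map onto request values like this; get_weather is the function defined earlier, and the final comment only illustrates where the value goes in the create call:

```python
# Illustrative tool_choice values as they appear in the request payload.
auto_choice = "auto"          # default: zero, one, or multiple calls
required_choice = "required"  # at least one call
forced_choice = {             # exactly this one function
    "type": "function",
    "function": {"name": "get_weather"},
}
none_choice = "none"          # behave as if no tools were passed

# e.g. client.chat.completions.create(..., tools=tools, tool_choice=forced_choice)
```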
Parallel function calling
The model may choose to call multiple functions in a single turn. You can prevent this by setting parallel_tool_calls to false, which ensures that at most one tool is called.
Note: Currently, if you are using a fine-tuned model and the model calls multiple functions in one turn, strict mode will be disabled for those calls.
Note for gpt-4.1-nano-2025-04-14: This snapshot of gpt-4.1-nano can sometimes include multiple tool calls for the same tool if parallel tool calls are enabled. It is recommended to disable this feature when using this nano snapshot.
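Illustratively, the request might be assembled like this; the tools list is a pared-down version of the earlier example:

```python
# Illustrative: disabling parallel tool calls so the model makes at most
# one tool call per turn.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
            "additionalProperties": False,
        },
        "strict": True,
    },
}]

request_kwargs = {
    "model": "gpt-4.1",
    "tools": tools,
    "parallel_tool_calls": False,
}
# e.g. client.chat.completions.create(messages=messages, **request_kwargs)
```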
Strict mode
Setting strict to true will ensure function calls reliably adhere to the function schema,
instead of being best effort. We recommend always enabling strict mode.
Under the hood, strict mode works by leveraging our structured outputs feature and therefore introduces a couple of requirements:
1 additionalProperties must be set to false for each object in the parameters .
2 All fields in properties must be marked as required .
You can denote optional fields by adding null as a type option (see example below).
Strict mode enabled Strict mode disabled
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Retrieves current weather for the given location.",
        "strict": true,
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                },
                "units": {
                    "type": ["string", "null"],
                    "enum": ["celsius", "fahrenheit"],
                    "description": "Units the temperature will be returned in."
                }
            },
            "required": ["location", "units"],
            "additionalProperties": false
        }
    }
}
All schemas generated in the playground have strict mode enabled.
While we recommend you enable strict mode, it has a few limitations:
1 Some features of JSON schema are not supported. (See supported schemas.)
Specifically for fine-tuned models:
1 Schemas undergo additional processing on the first request (and are then cached). If your
schemas vary from request to request, this may result in higher latencies.
2 Schemas are cached for performance, and are not eligible for zero data retention.
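A small, hypothetical validator can catch the two strict-mode requirements before you send a schema. violates_strict_mode below is an illustration, not part of the SDK, and it only checks the rules listed above:

```python
def violates_strict_mode(schema):
    # Walk the parameters schema and collect violations of the two
    # strict-mode requirements: additionalProperties must be false and
    # every property must appear in required, for every object.
    problems = []

    def walk(node, path):
        if not isinstance(node, dict):
            return
        if node.get("type") == "object":
            if node.get("additionalProperties") is not False:
                problems.append(f"{path}: additionalProperties must be false")
            props = node.get("properties", {})
            if set(node.get("required", [])) != set(props):
                problems.append(f"{path}: all properties must be required")
            for name, child in props.items():
                walk(child, f"{path}.{name}")
        if "items" in node:  # descend into array item schemas too
            walk(node["items"], f"{path}.items")

    walk(schema.get("parameters", {}), "parameters")
    return problems

ok = {
    "name": "get_weather",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
        "additionalProperties": False,
    },
}
bad = {
    "name": "get_weather",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": [],
    },
}
print(violates_strict_mode(ok))   # []
print(violates_strict_mode(bad))
```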
Streaming
Streaming can be used to surface progress by showing which function is called as the model
fills its arguments, and even displaying the arguments in real time.
Streaming function calls is very similar to streaming regular responses: you set stream to
true and get chunks with delta objects.
Streaming function calls python
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current temperature for a given location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City and country e.g. Bogotá, Colombia"
                }
            },
            "required": ["location"],
            "additionalProperties": False
        },
        "strict": True
    }
}]

stream = client.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "What's the weather like in Paris today?"}],
    tools=tools,
    stream=True
)

for chunk in stream:
    delta = chunk.choices[0].delta
    print(delta.tool_calls)
Output delta.tool_calls
[{"index": 0, "id": "call_DdmO9pD3xa9XTPNJ32zg2hcA", "function": {"arguments": "", "name": "get_weather"}, "type": "function"}]
[{"index": 0, "id": null, "function": {"arguments": "{\"", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "location", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "\":\"", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "Paris", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": ",", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": " France", "name": null}, "type": null}]
[{"index": 0, "id": null, "function": {"arguments": "\"}", "name": null}, "type": null}]
null
Instead of aggregating chunks into a single content string, however, you're aggregating
chunks into an encoded arguments JSON object.
When the model calls one or more functions, the tool_calls field of each delta will be populated. Each tool_call contains the following fields:
FIELD     DESCRIPTION
index     Identifies which function call the delta is for
id        Tool call id
function  Function call delta (name and arguments)
type      Type of tool_call (always function for function calls)
Many of these fields are only set for the first delta of each tool call, like id ,
function.name , and type .
Below is a code snippet demonstrating how to aggregate the delta s into a final
tool_calls object.
Accumulating tool_call deltas python
final_tool_calls = {}

for chunk in stream:
    for tool_call in chunk.choices[0].delta.tool_calls or []:
        index = tool_call.index

        if index not in final_tool_calls:
            # First delta for this call carries id, name, and type.
            final_tool_calls[index] = tool_call
        else:
            # Later deltas only extend the JSON-encoded arguments.
            final_tool_calls[index].function.arguments += tool_call.function.arguments
Accumulated final_tool_calls[0]
{
    "index": 0,
    "id": "call_RzfkBpJgzeR0S242qfvjadNe",
    "function": {
        "name": "get_weather",
        "arguments": "{\"location\":\"Paris, France\"}"
    }
}