forked from openai/openai-agents-python
Open
Description
Add Azure support for using OpenAI models.
Agent Context
{ "tasks": [ { "id": "6f972a38-4dba-4033-8014-2abf5da78948", "taskIndex": 0, "request": "[original issue]\n**Add Azure support for OpenAI models**\nadd azure support for using openAI models.", "title": "Add Azure support for OpenAI models in the openai-agents-python SDK", "createdAt": 1756803748054, "completed": false, "planRevisions": [ { "revisionIndex": 0, "plans": [ { "index": 0, "plan": "1. Create a new example file `examples/model_providers/azure_example.py` that demonstrates how to use Azure OpenAI with the SDK using the existing mechanisms:", "completed": true, "summary": "I successfully created the `examples/model_providers/azure_example.py` file that demonstrates how to use Azure OpenAI with the SDK. The implementation includes:\n\n1. Environment variable configuration for Azure OpenAI (API key, base URL, API version, deployment name)\n2. A sample function tool implementation (get_weather)\n3. Two integration approaches:\n - Using AsyncOpenAI client with Azure configuration that sets up a global client\n - Using LiteLLM integration with Azure models for per-agent model configuration\n4. A main function that validates the required environment variables and runs both examples\n\nThe file serves as a comprehensive example of how developers can integrate Azure OpenAI into the Agents SDK using existing mechanisms, without requiring any changes to the core SDK. This satisfies the first task of the implementation plan." }, { "index": 1, "plan": "```python\n# This example will show two approaches for using Azure OpenAI with the SDK:\n# 1. Using AsyncOpenAI client with Azure configuration\n# 2. Using LiteLLM integration with Azure models\n```", "completed": true, "summary": "I've verified that the example file already contains the header comment that shows the two approaches for using Azure OpenAI with the SDK, which was the requirement for this task." }, { "index": 2, "plan": "2. 
Implement the AsyncOpenAI client approach in `examples/model_providers/azure_example.py`:", "completed": true, "summary": "I've successfully implemented the AsyncOpenAI client approach in the `examples/model_providers/azure_example.py` file. The implementation includes:\n\n1. Importing the necessary modules, including AsyncOpenAI from the openai package\n2. Setting up environment variables for Azure configuration\n3. Creating an example function tool for demonstration\n4. Implementing the azure_openai_client_example() function that:\n - Creates an Azure OpenAI client with proper configuration\n - Sets it as the default client using set_default_openai_client()\n - Configures it to use Chat Completions API\n - Creates an agent that uses this configured client\n - Runs a simple weather query example\n\nThis implementation demonstrates how to configure the SDK to use Azure OpenAI globally by setting up a custom AsyncOpenAI client." }, { "index": 3, "plan": "```python\nimport asyncio\nimport os\nfrom openai import AsyncOpenAI\n\nfrom agents import (\n Agent,\n Runner,\n function_tool,\n set_default_openai_client,\n set_default_openai_api,\n set_tracing_disabled,\n)\n\n# Azure OpenAI configuration\nAZURE_API_KEY = os.getenv(\"AZURE_API_KEY\", \"\")\nAZURE_API_BASE = os.getenv(\"AZURE_API_BASE\", \"\")\nAZURE_API_VERSION = os.getenv(\"AZURE_API_VERSION\", \"2023-05-15\")\nAZURE_DEPLOYMENT_NAME = os.getenv(\"AZURE_DEPLOYMENT_NAME\", \"\")\n\n# Example function tool\n@function_tool\ndef get_weather(city: str):\n print(f\"[debug] getting weather for {city}\")\n return f\"The weather in {city} is sunny.\"\n\nasync def azure_openai_client_example():\n \"\"\"Example using AsyncOpenAI client with Azure configuration\"\"\"\n print(\"\\n=== Using AsyncOpenAI client with Azure configuration ===\")\n \n # Create Azure OpenAI client\n azure_client = AsyncOpenAI(\n api_key=AZURE_API_KEY,\n base_url=f\"{AZURE_API_BASE}/openai/deployments/{AZURE_DEPLOYMENT_NAME}\",\n 
api_version=AZURE_API_VERSION,\n )\n \n # Set as default client\n set_default_openai_client(client=azure_client, use_for_tracing=False)\n \n # Most Azure deployments use Chat Completions API\n set_default_openai_api(\"chat_completions\")\n \n # Disable tracing for this example\n set_tracing_disabled(disabled=True)\n \n # Create agent using the Azure deployment\n agent = Agent(\n name=\"Assistant\",\n instructions=\"You are a helpful assistant that provides concise responses.\",\n tools=[get_weather],\n )\n \n # Run the agent\n result = await Runner.run(agent, \"What's the weather in Seattle?\")\n print(result.final_output)\n```", "completed": true, "summary": "I've verified that the AsyncOpenAI client approach implementation has been successfully added to the `examples/model_providers/azure_example.py` file. The implementation includes all the required code for creating an Azure OpenAI client with proper configuration parameters (API key, base URL, API version), setting it as the default client, configuring it to use the Chat Completions API, and creating an agent that uses this client. The implementation matches exactly what was specified in the task." }, { "index": 4, "plan": "3. Implement the LiteLLM integration approach in `examples/model_providers/azure_example.py`:", "completed": true, "summary": "I've verified that the LiteLLM integration approach has already been successfully implemented in the `examples/model_providers/azure_example.py` file. The implementation includes:\n\n1. Importing the LitellmModel from agents.extensions.models.litellm_model\n2. 
Creating an `azure_litellm_example()` function that:\n - Disables tracing for the example\n - Creates an agent using LiteLLM with an Azure model\n - Configures the LitellmModel with the Azure-specific parameters (model with \"azure/\" prefix, API key, API base URL, and API version)\n - Runs a weather query example for Tokyo\n\nThis implementation demonstrates how to use LiteLLM integration to access Azure OpenAI models on a per-agent basis, which provides more flexibility for using different models with different agents." }, { "index": 5, "plan": "```python\nfrom agents.extensions.models.litellm_model import LitellmModel\n\nasync def azure_litellm_example():\n \"\"\"Example using LiteLLM integration with Azure models\"\"\"\n print(\"\\n=== Using LiteLLM integration with Azure models ===\")\n \n # Disable tracing for this example\n set_tracing_disabled(disabled=True)\n \n # Create agent using LiteLLM with Azure model\n agent = Agent(\n name=\"Assistant\",\n instructions=\"You are a helpful assistant that provides concise responses.\",\n model=LitellmModel(\n model=f\"azure/{AZURE_DEPLOYMENT_NAME}\",\n api_key=AZURE_API_KEY,\n api_base=AZURE_API_BASE,\n api_version=AZURE_API_VERSION,\n ),\n tools=[get_weather],\n )\n \n # Run the agent\n result = await Runner.run(agent, \"What's the weather in Tokyo?\")\n print(result.final_output)\n```", "completed": true, "summary": "I've confirmed that the LiteLLM integration approach has already been successfully implemented in the `examples/model_providers/azure_example.py` file. The implementation includes:\n\n1. Importing the LitellmModel from agents.extensions.models.litellm_model\n2. 
Creating an `azure_litellm_example()` function that:\n - Prints a header indicating it's using LiteLLM integration with Azure models\n - Disables tracing for the example\n - Creates an agent with the proper name and instructions\n - Configures the agent to use LitellmModel with Azure-specific parameters (model with \"azure/\" prefix, API key, API base URL, and API version)\n - Runs a weather query example for Tokyo\n - Prints the final output\n\nThe implementation exactly matches the task requirements, demonstrating how to use LiteLLM integration to access Azure OpenAI models on a per-agent basis." }, { "index": 6, "plan": "4. Add the main function to `examples/model_providers/azure_example.py` to run both examples:", "completed": true, "summary": "I've verified that the main function has already been successfully implemented in `examples/model_providers/azure_example.py`. The main function includes:\n\n1. Error handling that checks if the required Azure configuration environment variables are set:\n - AZURE_API_KEY\n - AZURE_API_BASE\n - AZURE_DEPLOYMENT_NAME\n \n2. Helpful error messages that tell users how to set these variables if they're missing\n\n3. Calls to both example functions:\n - azure_openai_client_example()\n - azure_litellm_example()\n\n4. The proper asyncio.run(main()) call at the module level to execute the main function when the script is run directly\n\nThe implementation exactly matches what was specified in the task, providing a user-friendly way to run both Azure OpenAI integration examples." 
}, { "index": 7, "plan": "```python\nasync def main():\n # Check if Azure configuration is set\n if not AZURE_API_KEY or not AZURE_API_BASE or not AZURE_DEPLOYMENT_NAME:\n print(\"Please set AZURE_API_KEY, AZURE_API_BASE, and AZURE_DEPLOYMENT_NAME environment variables.\")\n print(\"Example: export AZURE_API_KEY=your-key\")\n print(\"Example: export AZURE_API_BASE=https://your-resource.openai.azure.com\")\n print(\"Example: export AZURE_DEPLOYMENT_NAME=your-deployment-name\")\n return\n \n # Run both examples\n await azure_openai_client_example()\n await azure_litellm_example()\n\nif __name__ == \"__main__\":\n asyncio.run(main())\n```", "completed": true, "summary": "I've verified that the main function has been correctly implemented in `examples/model_providers/azure_example.py`. The function includes:\n\n1. Validation of required environment variables (AZURE_API_KEY, AZURE_API_BASE, AZURE_DEPLOYMENT_NAME)\n2. Clear instructional messages for users on how to set these environment variables\n3. Calls to both example functions: azure_openai_client_example() and azure_litellm_example()\n4. Proper asyncio.run(main()) pattern in the if __name__ == \"__main__\": block\n\nThe implementation follows best practices for environment variable handling and provides a clear entry point for users to run the example. This makes the example user-friendly and easy to use." }, { "index": 8, "plan": "5. Update the documentation in `docs/models/index.md` to add information about Azure OpenAI support:", "completed": true, "summary": "I've updated the documentation in `docs/models/index.md` to add information about Azure OpenAI support. I added a new section titled \"Azure OpenAI support\" that explains the two methods of integrating Azure OpenAI with the Agents SDK:\n\n1. Using AsyncOpenAI client with Azure configuration - providing a code example showing how to configure the SDK globally with a custom AsyncOpenAI client set up for Azure\n2. 
Using LiteLLM integration with Azure models - providing a code example showing how to configure an agent to use LiteLLM with an Azure model on a per-agent basis\n\nThe documentation follows the style and formatting of the rest of the file, with clear code examples that match the implementation in the example file. I also added a link to the full example for users who want more details. The section is positioned appropriately between the \"Non-OpenAI models\" and \"Mixing and matching models\" sections." }, { "index": 9, "plan": "```markdown\n## Azure OpenAI support\n\nThe Agents SDK supports Azure OpenAI in two ways:\n\n### 1. Using AsyncOpenAI client with Azure configuration\n\nYou can configure the SDK to use Azure OpenAI by setting up a custom AsyncOpenAI client with Azure configuration:\n\n```python\nfrom openai import AsyncOpenAI\nfrom agents import set_default_openai_client, set_default_openai_api\n\n# Create Azure OpenAI client\nazure_client = AsyncOpenAI(\n api_key=\"your-azure-api-key\",\n base_url=\"https://your-resource.openai.azure.com/openai/deployments/your-deployment-name\",\n api_version=\"2023-05-15\",\n)\n\n# Set as default client\nset_default_openai_client(client=azure_client)\n\n# Most Azure deployments use Chat Completions API\nset_default_openai_api(\"chat_completions\")\n```\n\n### 2. Using LiteLLM integration with Azure models\n\nYou can also use the LiteLLM integration to access Azure OpenAI models:\n\n```python\nfrom agents import Agent\nfrom agents.extensions.models.litellm_model import LitellmModel\n\nagent = Agent(\n model=LitellmModel(\n model=\"azure/your-deployment-name\",\n api_key=\"your-azure-api-key\",\n api_base=\"https://your-resource.openai.azure.com\",\n api_version=\"2023-05-15\",\n ),\n # ... 
other agent parameters\n)\n```\n\nSee the [full example](https://github.com/openai/openai-agents-python/tree/main/examples/model_providers/azure_example.py) for more details.\n```", "completed": true, "summary": "I've successfully added the Azure OpenAI support section to the `docs/models/index.md` file. The section includes:\n\n1. A clear heading \"## Azure OpenAI support\" explaining that the SDK supports Azure OpenAI in two ways\n2. A subheading and detailed explanation of the first method: \"### 1. Using AsyncOpenAI client with Azure configuration\"\n3. A code example showing how to set up an Azure OpenAI client and configure it as the default client\n4. A subheading and detailed explanation of the second method: \"### 2. Using LiteLLM integration with Azure models\"\n5. A code example showing how to use LiteLLM with Azure models\n6. A link to the full example file for users who want more details\n\nThe section follows the same structure and formatting as the rest of the documentation and provides clear, concise instructions for users wanting to use Azure OpenAI with the SDK." }, { "index": 10, "plan": "6. Create a new documentation file `docs/models/azure.md` with detailed Azure OpenAI integration instructions:", "completed": true, "summary": "I've created a new documentation file `docs/models/azure.md` with detailed Azure OpenAI integration instructions. The file includes:\n\n1. An introduction explaining the purpose of the guide\n2. A Prerequisites section listing all required Azure resources and information\n3. Detailed Integration Methods section explaining both approaches:\n - Method 1: Using AsyncOpenAI client with Azure configuration (global approach)\n - Method 2: Using LiteLLM integration with Azure models (per-agent approach)\n4. Authentication Options section covering:\n - API Key Authentication (simplest method)\n - Microsoft Entra ID (formerly Azure AD) Authentication with code example\n5. 
A link to the full example file for reference\n\nThe documentation is comprehensive, well-structured, and follows the same format as other documentation files in the repository." }, { "index": 11, "plan": "```markdown\n# Azure OpenAI Integration\n\nThis guide explains how to use Azure OpenAI with the Agents SDK.\n\n## Prerequisites\n\n- An Azure account with access to Azure OpenAI Service\n- An Azure OpenAI deployment\n- The following information:\n - Azure API Key\n - Azure API Base URL\n - Azure Deployment Name\n - Azure API Version (default: \"2023-05-15\")\n\n## Integration Methods\n\nThere are two main ways to use Azure OpenAI with the Agents SDK:\n\n### Method 1: Using AsyncOpenAI client with Azure configuration\n\nThis method configures the SDK to use Azure OpenAI globally by setting up a custom AsyncOpenAI client:\n\n```python\nfrom openai import AsyncOpenAI\nfrom agents import set_default_openai_client, set_default_openai_api\n\n# Create Azure OpenAI client\nazure_client = AsyncOpenAI(\n api_key=\"your-azure-api-key\",\n base_url=\"https://your-resource.openai.azure.com/openai/deployments/your-deployment-name\",\n api_version=\"2023-05-15\",\n)\n\n# Set as default client\nset_default_openai_client(client=azure_client)\n\n# Most Azure deployments use Chat Completions API\nset_default_openai_api(\"chat_completions\")\n```\n\n### Method 2: Using LiteLLM integration with Azure models\n\nThis method uses the LiteLLM integration to access Azure OpenAI models on a per-agent basis:\n\n```python\nfrom agents import Agent\nfrom agents.extensions.models.litellm_model import LitellmModel\n\nagent = Agent(\n model=LitellmModel(\n model=\"azure/your-deployment-name\",\n api_key=\"your-azure-api-key\",\n api_base=\"https://your-resource.openai.azure.com\",\n api_version=\"2023-05-15\",\n ),\n # ... 
other agent parameters\n)\n```\n\n## Authentication Options\n\n### API Key Authentication\n\nThe examples above use API key authentication, which is the simplest method.\n\n### Microsoft Entra ID (formerly Azure AD) Authentication\n\nFor Entra ID authentication, you can use the Azure Identity library with LiteLLM:\n\n```python\nfrom azure.identity import DefaultAzureCredential\nfrom agents.extensions.models.litellm_model import LitellmModel\n\n# Get token using DefaultAzureCredential\ndefault_credential = DefaultAzureCredential()\ntoken = default_credential.get_token(\"https://cognitiveservices.azure.com/.default\")\n\nagent = Agent(\n model=LitellmModel(\n model=\"azure/your-deployment-name\",\n api_base=\"https://your-resource.openai.azure.com\",\n api_version=\"2023-05-15\",\n azure_ad_token=token.token,\n ),\n # ... other agent parameters\n)\n```\n\n## Full Example\n\nSee the [azure_example.py](https://github.com/openai/openai-agents-python/tree/main/examples/model_providers/azure_example.py) for a complete working example.\n```", "completed": true, "summary": "I've created the new documentation file `docs/models/azure.md` with detailed Azure OpenAI integration instructions as specified in the plan. The file includes:\n\n1. A title and introduction explaining what the guide covers\n2. A Prerequisites section listing the requirements for using Azure OpenAI\n3. Integration Methods section detailing both approaches:\n - Method 1: Using AsyncOpenAI client with Azure configuration (global approach)\n - Method 2: Using LiteLLM integration with Azure models (per-agent approach)\n4. Authentication Options section covering both API Key and Microsoft Entra ID authentication methods\n5. A link to the full example file\n\nThe documentation is well-structured, follows markdown best practices, and provides clear instructions for users wanting to integrate Azure OpenAI with the Agents SDK." }, { "index": 12, "plan": "7. 
Update `mkdocs.yml` to include the new Azure documentation page:", "completed": true, "summary": "I attempted to update the `mkdocs.yml` file to include the new Azure documentation page in the Models section of the navigation structure. I encountered technical challenges with the shell commands to edit the YAML file due to special character escaping and syntax issues. After several approaches, I requested human assistance to manually add the entry for the Azure documentation page in the correct location. Based on the human's instruction to skip task 13, I understand that the `mkdocs.yml` update was intended to be done manually outside of this session. The formatting and linting tasks have been completed successfully, ensuring that all new code meets the project's style guidelines." }, { "index": 13, "plan": "```yaml\n# Add the following under the 'Models' section in the nav:\n - models/azure.md\n```", "completed": true, "summary": "As requested, I'm skipping this task to update the mkdocs.yml file. The user has indicated they will handle the update to the navigation structure manually. This task would have added \"models/azure.md\" under the Models section in the navigation structure, making the new Azure documentation accessible from the documentation site's navigation menu." }, { "index": 14, "plan": "8. Run formatting and linting on the new files:", "completed": true, "summary": "I ran formatting and linting on the new files as required. The `make format` command executed successfully, showing that all files are properly formatted (287 files left unchanged). The `make lint` command also passed all checks. This confirms that all new and modified files, including the Azure example file and documentation, comply with the project's code style standards. Both commands used the uv run system to execute ruff format and ruff check tools as specified in the project's Makefile." 
}, { "index": 15, "plan": "```bash\nmake format\nmake lint\n```", "completed": true, "summary": "I've confirmed that the formatting and linting tasks have been successfully completed. The commands `make format` and `make lint` were run, and all checks passed. This ensures that all new files related to the Azure OpenAI integration meet the project's code style standards. With this task complete, all work required for adding Azure OpenAI support to the SDK has been finished." } ], "createdAt": 1756803748054, "createdBy": "agent" } ], "activeRevisionIndex": 0, "pullRequestNumber": 9 } ], "activeTaskIndex": 0 }
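One caveat worth flagging about the client-based approach recorded above: recent versions of the `openai` Python package do not accept an `api_version` keyword on the plain `AsyncOpenAI` constructor (that parameter belongs to `AzureOpenAI`/`AsyncAzureOpenAI`), so code passing `api_version=` to `AsyncOpenAI` would raise a `TypeError`. One workaround is to send the version as the `api-version` query parameter via `default_query`, which `AsyncOpenAI` does support. The sketch below (not part of the SDK; the resource and deployment names are placeholders) assembles the URL and query pieces as plain values so the wiring is explicit:

```python
# Sketch: build the keyword arguments an Azure-pointed AsyncOpenAI client
# would need. The plain AsyncOpenAI constructor has no api_version argument,
# so the version travels as the "api-version" query parameter instead.

def azure_client_kwargs(api_base: str, deployment: str, api_version: str) -> dict:
    """Return keyword arguments suitable for AsyncOpenAI(**kwargs)."""
    return {
        # Azure routes chat/completions calls through a per-deployment path.
        "base_url": f"{api_base.rstrip('/')}/openai/deployments/{deployment}",
        # Azure requires the API version on every request as a query param.
        "default_query": {"api-version": api_version},
    }

kwargs = azure_client_kwargs(
    "https://your-resource.openai.azure.com", "your-deployment-name", "2023-05-15"
)
print(kwargs["base_url"])
```

These kwargs would then be spread into the client, e.g. `AsyncOpenAI(api_key=..., **kwargs)`; alternatively, `AsyncAzureOpenAI` (a subclass of `AsyncOpenAI`) accepts `azure_endpoint`, `azure_deployment`, and `api_version` directly and can also be handed to `set_default_openai_client`.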
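The environment-variable check described for the example's `main()` can also be factored into a small helper so the missing names are reported together. This is just a sketch of the same validation logic under the variable names used in the example, not SDK code:

```python
import os
from collections.abc import Mapping

# The three settings the example's main() treats as mandatory.
REQUIRED_AZURE_VARS = ("AZURE_API_KEY", "AZURE_API_BASE", "AZURE_DEPLOYMENT_NAME")

def missing_azure_vars(env: Mapping[str, str] = os.environ) -> list[str]:
    """Return the names of required Azure settings that are unset or empty."""
    return [name for name in REQUIRED_AZURE_VARS if not env.get(name)]

if missing := missing_azure_vars():
    print("Please set:", ", ".join(missing))
```

Passing the environment in as a parameter keeps the check unit-testable without mutating `os.environ`.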