
No OpenAI tracing with Azure Foundry models #1051

Open
@lamaeldo

Description


Describe the bug

My agent uses 4.1 models served on Azure Foundry. The default OpenAI tracing is partially broken: traces are created and timed correctly, as are their spans, but any span that corresponds to an LLM call is empty. Selecting it shows no content:

(Screenshot: an LLM call span selected in the trace viewer, showing no content)

Debug information

  • Agents SDK version: 0.1.0 (latest)
  • Python version: reproduced across 3.10 to 3.13
  • OpenAI Python library version: 1.90.0

Repro steps

import os

from openai import AsyncOpenAI
from agents import OpenAIResponsesModel, Runner, trace

# The endpoint looks like this: "https://XXX.openai.azure.com/"
client = AsyncOpenAI(
    api_key=settings.main_key,
    base_url=str(settings.endpoint) + "/openai/v1/",
    default_query={"api-version": "preview"},
)
model = OpenAIResponsesModel(
    model=settings.model_name,
    openai_client=client,
)

# Key used by the tracing exporter (uploads go to api.openai.com)
os.environ["OPENAI_API_KEY"] = "sk-..."

with trace(workflow_name="Testrun - " + idparameter):
    result = await Runner.run(agent, conversation_items)

During the run, each agent LLM call produces:
INFO:httpx:HTTP Request: POST https://api.openai.com/v1/traces/ingest "HTTP/1.1 204 No Content"

So the exporter does reach api.openai.com and gets a 204, but I assume the uploaded span payloads are somehow malformed or incomplete when the responses come from an Azure Foundry-served OpenAI Responses model.
