
xAI Grok 4 Usage #1056

Open
@ruidazeng

Description


Please read this first

  • Have you read the custom model provider docs, including the 'Common issues' section? Yes – reviewed the model provider docs.
  • Have you searched for related issues? Yes – could not find a match.

Describe the question

I’d like to call xAI Grok-4 through LiteLLM’s xAI provider from inside the OpenAI Agents SDK. I can’t find an example or instructions for registering a LiteLLM backend so that the Agents SDK (or a custom client) routes its calls to Grok-4.


Debug information

  • Agents SDK version: v0.1.0
  • Python version: 3.10.13
  • LiteLLM version: v1.37.0
  • Environment: macOS 14.4 (Apple Silicon)

Repro steps

import os

import litellm
from openai import OpenAI  # base OpenAI client that the Agents SDK builds on

# 1️⃣  Configure LiteLLM to talk to xAI Grok-4
# (LiteLLM's xAI provider reads the key from XAI_API_KEY)
os.environ["XAI_API_KEY"] = os.getenv("XAI_API_KEY", "")
litellm.model_alias_map["grok-4"] = "xai/grok-4"

# 2️⃣  Attempt to register LiteLLM as a custom transport for the Agents SDK
client = OpenAI(                     # ← not sure how to plug LiteLLM in here
    api_key="dummy",                 # (ignored by LiteLLM)
    base_url=None,                   # ? should this point at a LiteLLM proxy?
)

response = client.chat.completions.create(
    model="xai/grok-4",
    messages=[{"role": "user", "content": "Hello from Agents SDK"}],
)
print(response)
