ollama APIConnectionError #120


Closed
a7mad911 opened this issue Mar 13, 2025 · 3 comments
Labels
question Question about using the SDK

Comments


a7mad911 commented Mar 13, 2025

I get this error when using Ollama:

from agents import Agent, Runner, ModelSettings, OpenAIChatCompletionsModel, set_default_openai_client
from openai import AsyncOpenAI

external_client = AsyncOpenAI(
    api_key="",
    base_url="http://localhost:11434/v1",
)

set_default_openai_client(external_client)

spanish_agent = Agent(
    name="Spanish agent",
    instructions="You only speak English.",
    model=OpenAIChatCompletionsModel(
        model="llama3.1",
        openai_client=external_client,
    ),
    model_settings=ModelSettings(temperature=0.5),
)

res = Runner.run_sync(
    starting_agent=spanish_agent,
    input="hi",
)

print(res.final_output)

Error:
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
OPENAI_API_KEY is not set, skipping trace export
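
The APIConnectionError itself just means the client never reached http://localhost:11434/v1, so before changing any agent code it's worth confirming that Ollama is actually serving on that port. A minimal connectivity check (a sketch, assuming the default Ollama port and reusing the same AsyncOpenAI setup; the dummy "ollama" key is only a placeholder):

import asyncio
from openai import AsyncOpenAI

# Same endpoint the agent code uses; Ollama ignores the key, but it must be non-empty.
client = AsyncOpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

async def check() -> None:
    # If this raises APIConnectionError, the Ollama server is not reachable
    # (not running, wrong port, or blocked), independent of the agents SDK.
    models = await client.models.list()
    print([m.id for m in models.data])  # e.g. ["llama3.1:latest", ...]

asyncio.run(check())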

a7mad911 added the question label Mar 13, 2025

LeonG7 commented Mar 13, 2025

See #44.


keldenl commented Mar 13, 2025

I tried #44 and it didn't work out of the box. I had to mix what was in that thread with https://openai.github.io/openai-agents-python/models/, but it works with Ollama!

This is an example with the starter code:

from openai import AsyncOpenAI
from agents import OpenAIChatCompletionsModel, Agent, Runner
from agents import set_default_openai_client, set_tracing_disabled

external_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required, but unused
)
set_default_openai_client(external_client)
set_tracing_disabled(True)


agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant",
    model=OpenAIChatCompletionsModel(
        model="gemma3:12b",
        openai_client=external_client,
    ),
)

result = Runner.run_sync(agent, "Write a haiku about recursion in programming.")
print(result.final_output)

# Code within the code,
# Functions calling themselves,
# Infinite loop's dance.
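
Since the snippet already swaps the default client, one variation worth noting (a sketch based on the models docs page linked above, assuming set_default_openai_api is available in your version of the agents package) is to also switch the default API to chat completions; then agents can take the Ollama model name as a plain string instead of wrapping it in OpenAIChatCompletionsModel:

from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client, set_default_openai_api, set_tracing_disabled

external_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required, but unused
)
set_default_openai_client(external_client)
set_default_openai_api("chat_completions")  # Ollama's /v1 endpoint speaks Chat Completions, not Responses
set_tracing_disabled(True)

# With the defaults set above, the model can be referenced by name only.
agent = Agent(name="Assistant", instructions="You are a helpful assistant", model="gemma3:12b")

result = Runner.run_sync(agent, "Write a haiku about recursion in programming.")
print(result.final_output)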


keldenl commented Mar 13, 2025

Another issue I ran into: models that DO NOT support tool calling will NOT work with the handoff example. Any Ollama model that supports tool calling (https://ollama.com/search?c=tools) will work.

Here's a working example of the triage-agent handoff using Qwen 2.5:

import asyncio

from openai import AsyncOpenAI
from agents import OpenAIChatCompletionsModel, Agent, Runner
from agents import set_default_openai_client, set_tracing_disabled

external_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",  # required, but unused
)
set_default_openai_client(external_client)
set_tracing_disabled(True)


model = OpenAIChatCompletionsModel(
    model="qwen2.5:14b-instruct-q5_K_M",
    openai_client=external_client,
)

spanish_agent = Agent(
    name="Spanish agent",
    instructions="You only speak Spanish.",
    model=model
)

english_agent = Agent(
    name="English agent",
    instructions="You only speak English",
    model=model
)

triage_agent = Agent(
    name="Triage agent",
    instructions="Handoff to the appropriate agent based on the language of the request.",
    handoffs=[spanish_agent, english_agent],
    model=model
)


async def main():
    result = await Runner.run(triage_agent, input="Hola, ¿cómo estás?")
    print(result.final_output)
    # ¡Hola! Estoy bien, gracias por preguntar. ¿Y tú, cómo estás?


if __name__ == "__main__":
    asyncio.run(main())
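
If you're unsure whether a given Ollama model supports tool calling, one quick probe (a sketch, assuming Ollama's OpenAI-compatible endpoint rejects tool-enabled requests for models without tool support; the dummy tool and model names here are just examples) is to send a trivial request with a throwaway tool and see whether it errors:

from openai import OpenAI, BadRequestError

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

dummy_tool = {
    "type": "function",
    "function": {
        "name": "noop",
        "description": "Does nothing.",
        "parameters": {"type": "object", "properties": {}},
    },
}

def supports_tools(model: str) -> bool:
    try:
        client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": "hi"}],
            tools=[dummy_tool],
        )
        return True
    except BadRequestError:
        # Ollama typically rejects tool-enabled requests for models that lack tool support.
        return False

print(supports_tools("qwen2.5:14b-instruct-q5_K_M"))  # expected True for a tools-capable model
print(supports_tools("gemma3:12b"))                   # may be False if the model lacks tool support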
