
Conversation


@seratch seratch commented Aug 26, 2025

This pull request resolves #1564

@seratch seratch requested a review from rm-openai August 26, 2025 12:16
@seratch seratch added enhancement New feature or request feature:core labels Aug 26, 2025

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


Codex Review: Here are some suggestions.

[P1] Conversation IDs are never surfaced to callers

[openai_responses.py:140-142] The OpenAIResponsesModel now accepts a conversation_id and passes it to _client.responses.create, but the response’s generated conversation_id is discarded when wrapping the Response object in ModelResponse (lines 140‑143). Only the response_id is returned. When a new conversation is created, callers have no way to retrieve the generated conversation ID to supply on later turns, which makes stored conversations unusable from the SDK. The model should capture response.conversation_id (or equivalent) and return it through ModelResponse/RunResult so that a conversation can be resumed.



@rm-openai rm-openai left a comment


This generally looks good to me. A couple of quick questions in case you've thought about them:

  1. It's possible that the user doesn't want every LLM call or every agent to read from or write to the conversation. e.g. imagine the setup is classify the query -> route to an agent. Maybe you only want the 2nd one to be reading/writing from conversation. How would that be enabled?
  2. This is somewhat tangential, but curious if you think the Sessions concept should have a Conversations-based wrapper?


seratch commented Aug 27, 2025

It's possible that the user doesn't want every LLM call or every agent to read from or write to the conversation. e.g. imagine the setup is classify the query -> route to an agent. Maybe you only want the 2nd one to be reading/writing from conversation. How would that be enabled?

If these two runs are separate (i.e., a deterministic workflow that just runs two agents sequentially), that's simple. If you mean handoffs, indeed it's not feasible with this feature addition; the same is true for sessions as well. I am not sure if attaching conversation_id when instantiating an Agent object is the right solution here. Probably not. Beyond that, I don't have any other ideas.

I think the immediate suggestion from us would be: you can use conversation_id, previous_response_id, and session per agent run.
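To illustrate the "session only where you want it" point for the classify-then-route setup: a minimal sketch with hypothetical stand-ins (InMemorySession, classify, answer are invented here, not SDK API; Agent/Runner are elided since wiring up the real SDK needs an API key). Only the second step reads from and writes to the conversation.

```python
import asyncio

# Hypothetical stand-in for the SDK's Session protocol (get_items/add_items),
# just to show the shape of "session only on the second run".
class InMemorySession:
    def __init__(self) -> None:
        self.items: list[dict] = []

    async def get_items(self) -> list[dict]:
        return list(self.items)

    async def add_items(self, items: list[dict]) -> None:
        self.items.extend(items)

async def classify(query: str) -> str:
    # First run: a throwaway classification that should NOT pollute
    # the conversation, so no session is passed here.
    return "billing" if "invoice" in query.lower() else "general"

async def answer(query: str, session: InMemorySession) -> str:
    # Second run: the only step that reads/writes the conversation.
    history = await session.get_items()
    await session.add_items([{"role": "user", "content": query}])
    return f"[{len(history)} prior items] routed answer for: {query}"

async def main() -> str:
    session = InMemorySession()
    route = await classify("Where is my invoice?")
    assert route == "billing"
    return await answer("Where is my invoice?", session)

print(asyncio.run(main()))
```

With the real SDK, the same shape would mean passing session (or conversation_id) only to the second agent's run call, leaving the classifier run stateless.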

This is somewhat tangential, but curious if you think the Sessions concept should have a Conversations-based wrapper?

A Conversations API-based sessions implementation is worth considering, and it should be feasible. If a user is fine with frequently calling the API endpoints, it should be a good option. I can explore the implementation tomorrow or later this week.


@rm-openai rm-openai left a comment


small comment but otherwise this looks fine to ship as is

@seratch seratch force-pushed the issue-1564-conversation branch from 4760aff to 977843c Compare August 28, 2025 04:10
@@ -102,268 +97,3 @@ async def pop_item(self) -> TResponseInputItem | None:
async def clear_session(self) -> None:
"""Clear all items for this session."""
...


class SQLiteSession(SessionABC):

moved this to sqlite_session.py

class OpenAISession(SessionABC):
def __init__(
self,
session_id: str | None = None,

use conversation_id as session_id; you can pass the id if you already have one, but if you don't, this class creates a new one under the hood

@seratch seratch left a comment


comments for reviewers

@@ -20,28 +20,56 @@ async def main():
create_tables=True,

moved to memory dir and made the code consistent with others

openai_client: Optional[AsyncOpenAI] = None,
):
# this implementation allows to set this value later
self.session_id = session_id # type: ignore

using conversation_id as session_id; you can pass one if you already have it; otherwise, this class generates a new one under the hood

from .session import SessionABC


class SQLiteSession(SessionABC):

moved from session.py


@rm-openai rm-openai left a comment


this generally looks good but feels like there are too many type:ignores here

return response.id


class OpenAISession(SessionABC):

Suggested change
class OpenAISession(SessionABC):
class OpenAIConversationsSession(SessionABC):


renamed

class OpenAISession(SessionABC):
def __init__(
self,
session_id: Optional[str] = None,

Suggested change
session_id: Optional[str] = None,
conversation_id: str | None = None,


somehow, this fails the Python 3.9 tests; I had to use Optional in this file


I don't know what triggers the A | B error with Python 3.9, but for this one, let me go with Optional[A]. When we drop Python 3.9 support, we can make it consistent.


You can just add from __future__ import annotations to the top of the file for these
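The tip works because PEP 563's future import turns every annotation into an unevaluated string, so `str | None` is never actually evaluated at runtime on Python 3.9. A minimal sketch (the class name follows the PR's eventual rename; the attribute layout here is illustrative, not the PR's exact code):

```python
from __future__ import annotations  # must come before other imports

# Without the future import, evaluating `str | None` when the def is
# executed raises TypeError on Python 3.9; with it, annotations stay as
# strings, so the new union syntax is safe in signatures.
class OpenAIConversationsSession:
    def __init__(self, conversation_id: str | None = None) -> None:
        self._conversation_id = conversation_id

session = OpenAIConversationsSession()
print(session._conversation_id)  # None
```

One caveat: this only helps inside annotations. A runtime expression such as `X = str | None` or `isinstance(x, str | None)` is still evaluated and still fails on 3.9.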


Ah I see. will do


OK, this is done ✅

def __init__(
self,
session_id: Optional[str] = None,
openai_client: Optional[AsyncOpenAI] = None,

Suggested change
openai_client: Optional[AsyncOpenAI] = None,
openai_client: AsyncOpenAI | None = None,

Also feels kinda weird to ask for a client here when in most places we don't require this?


having to create an instance isn't great when you're using the default openai client


forgot to mention other patterns; if you want to pass a client for a certain reason (say, you manage multiple instances for a multi-tenant app), accepting this should be helpful for those cases.


seratch commented Aug 28, 2025

feels like there are too many type:ignores here

yeah, but it was inevitable to call model_dump() to make the returned objects serializable
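For context on why model_dump() forces the `type: ignore`s: the Conversations API returns pydantic model objects, while the Session protocol deals in plain TResponseInputItem dicts. A sketch of the conversion with a hypothetical stand-in class (FakeConversationItem is invented here so the example runs without the openai package; real code would call model_dump() on the API's item objects):

```python
import json

# Hypothetical stand-in mimicking a pydantic model returned by the
# Conversations API; only the model_dump() interface matters here.
class FakeConversationItem:
    def __init__(self, role: str, content: str) -> None:
        self.role, self.content = role, content

    def model_dump(self) -> dict:
        return {"type": "message", "role": self.role, "content": self.content}

items = [FakeConversationItem("user", "hello")]
# Dumping to plain dicts is what makes the items JSON-serializable; the
# dict loses the model's static type, which is where the type checker
# starts demanding `type: ignore` (or a cast) on the return value.
dumped = [item.model_dump() for item in items]
print(json.dumps(dumped))
```

A typed cast to the expected input-item type at the dump site is the usual way to confine those ignores to one line.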

@seratch seratch force-pushed the issue-1564-conversation branch from 1bce3f4 to c531e96 Compare August 29, 2025 00:16

seratch commented Aug 29, 2025

@rm-openai renamed the class and resolved most of the type ignores. let me know if you see anything else to change; otherwise, you can merge at any time

@seratch seratch mentioned this pull request Aug 29, 2025
@seratch seratch force-pushed the issue-1564-conversation branch from 13c9fbd to d9cd39b Compare August 29, 2025 01:37
def __init__(
self,
*,
session_id: str | None = None,

can you rename to conversation_id please?

openai_client: AsyncOpenAI | None = None,
):
# this implementation allows to set this value later
self.session_id = session_id or _EMPTY_SESSION_ID

a better pattern is:

self._session_id: str | None

with

async def _get_session_id(self) -> str:
  if not self._session_id:
    self._session_id = await ...
  return self._session_id
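Filling out that lazy-initialization suggestion as a runnable sketch: the `await ...` above stands for something like awaiting the client's conversation-create call and taking its id; a stub replaces it here so the example runs offline, and the `_created` counter is instrumentation for the sketch only.

```python
from __future__ import annotations

import asyncio

class OpenAIConversationsSession:
    def __init__(self, conversation_id: str | None = None) -> None:
        # None means "create a conversation lazily on first use".
        self._session_id: str | None = conversation_id
        self._created = 0  # sketch-only instrumentation

    async def _create_conversation(self) -> str:
        # Stand-in for awaiting the client's conversation-create call
        # and returning the new conversation's id.
        self._created += 1
        return f"conv_{self._created:04d}"

    async def _get_session_id(self) -> str:
        if self._session_id is None:
            self._session_id = await self._create_conversation()
        return self._session_id

async def main() -> None:
    session = OpenAIConversationsSession()
    first = await session._get_session_id()
    second = await session._get_session_id()
    assert first == second and session._created == 1  # created exactly once

asyncio.run(main())
```

This avoids the sentinel-value pattern entirely: callers always go through `_get_session_id()`, and an explicitly supplied conversation id short-circuits the creation path.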

if _openai_client is None:
_openai_client = get_default_openai_client() or AsyncOpenAI()
# this never be None here
self.openai_client: AsyncOpenAI = _openai_client

generally prefix with _

Comment on lines 154 to 155
previous_response_id: str | None = None, # unused
conversation_id: str | None = None, # unused

seems used to me? line 170/171


ah, ai editor did extra...

@seratch seratch left a comment


all the comments should be resolved now; if you see anything extra tomorrow, please feel free to modify this branch before merging


@rm-openai rm-openai merged commit 164acb5 into main Aug 29, 2025
5 checks passed
@rm-openai rm-openai deleted the issue-1564-conversation branch August 29, 2025 14:06
@surfingdev

This is exactly what I was looking for. I see this is already out, but there is no documentation for it, right? cc @seratch

Labels
enhancement New feature or request feature:core
Development

Successfully merging this pull request may close these issues.

Add conversations API support
3 participants