feat: inference api support return with resolved message #2913

Open
wants to merge 1 commit into main
Conversation


@wangfenjin wangfenjin commented Jul 24, 2025

Please note that most of this code was generated by an LLM; I reviewed it and rejected some unexpected edits.

Related to #2896


Important

Add support for returning resolved messages from the inference API to maintain context in multi-turn conversations.

  • Behavior:
    • Adds an include_resolved_messages flag to Params in inference.rs to include resolved messages in the response (see the sketch after this description).
    • Updates inference() in inference.rs to extract resolved messages if include_resolved_messages is true.
    • InferenceResponse::new() in inference.rs now accepts resolved_messages.
  • Tests:
    • Adds tests for inference_response_with_resolved_messages and include_resolved_messages_integration in inference.rs.
  • Compatibility:
    • include_resolved_messages set to false in openai_compatible.rs as OpenAI endpoint does not support it.
  • Misc:
    • Updates client_inference_params.rs and batch_inference.rs to handle include_resolved_messages flag, defaulting to false.

This description was created by Ellipsis for 7f91b09.

Contributor

github-actions bot commented Jul 24, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@wangfenjin
Author

I have read the Contributor License Agreement (CLA) and hereby sign the CLA.

github-actions bot added a commit that referenced this pull request Jul 24, 2025
@wangfenjin wangfenjin changed the title from "feat: inference api support return with resolved message for client to buil…" to "feat: inference api support return with resolved message" on Jul 24, 2025