feat(tracing): enable custom inferenceId for trace decorator #527
Summary
This PR enables users to set custom inference IDs when using the @trace decorator, addressing the need for request correlation and integration with external systems. Previously, inference IDs were always auto-generated UUIDs, making it difficult to correlate traces with external request identifiers.

Changes
- Add an inference_id field to the Trace class for a cleaner architecture
- Support update_current_trace(inferenceId="custom_id")
- Simplify the post_process_trace logic by removing special metadata filtering

Context
Users need the ability to set custom inference IDs for several critical use cases, such as correlating traces with upstream request identifiers and integrating with external systems. The previous implementation stored inferenceId in trace metadata, which was semantically incorrect and required complex filtering logic. This PR refactors the architecture to use a dedicated field while maintaining full backward compatibility.

Closes: OPEN-7312
Testing
Usage Examples
Basic Custom Inference ID
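The original code sample for this example did not survive extraction. The sketch below illustrates the intended behavior with a minimal, self-contained stand-in for the SDK's tracing machinery (the Trace class, @trace decorator, and update_current_trace names match the PR description, but this stand-in's internals are illustrative, not the SDK's actual implementation):

```python
import uuid
import contextvars

# Stand-in for the SDK's current-trace storage (hypothetical internals).
_current_trace = contextvars.ContextVar("current_trace", default=None)

class Trace:
    def __init__(self):
        # Dedicated field, auto-populated with a UUID when no custom ID is set.
        self.inference_id = str(uuid.uuid4())

def update_current_trace(inferenceId=None):
    # Override the auto-generated inference ID on the active trace.
    trace = _current_trace.get()
    if trace is not None and inferenceId is not None:
        trace.inference_id = inferenceId

def trace(func):
    def wrapper(*args, **kwargs):
        t = Trace()
        token = _current_trace.set(t)
        try:
            result = func(*args, **kwargs)
        finally:
            _current_trace.reset(token)
        wrapper.last_trace = t  # exposed here only for demonstration
        return result
    return wrapper

@trace
def answer_question(question):
    # Replace the auto-generated UUID with a custom inference ID.
    update_current_trace(inferenceId="custom_id")
    return f"answer to {question}"

answer_question("What is tracing?")
print(answer_question.last_trace.inference_id)  # custom_id
```

If update_current_trace is never called, the trace keeps its auto-generated UUID, which is the backward-compatible default described above.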
Request Correlation
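This example's code block was also lost. A plausible correlation pattern, sketched with a hypothetical header-handling helper (the X-Request-ID header name and handle_request function are assumptions, not part of the SDK):

```python
import uuid

def handle_request(headers):
    # Reuse the caller's request ID when present so the trace can be
    # joined with upstream logs; fall back to a fresh UUID otherwise.
    request_id = headers.get("X-Request-ID") or str(uuid.uuid4())
    # Inside a @trace-decorated handler you would then call:
    #   update_current_trace(inferenceId=request_id)
    return request_id

print(handle_request({"X-Request-ID": "req-123"}))  # req-123
```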
External System Integration
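For integration with an external system, the inference ID can be derived from that system's own identifiers so a trace can later be looked up from the external side. The message shape and helper below are hypothetical:

```python
def build_inference_id(message):
    # Derive a deterministic inference ID from an external system's
    # identifiers (hypothetical message shape: source + job_id).
    return f"{message['source']}-{message['job_id']}"

# Inside a @trace-decorated function you would then call:
#   update_current_trace(inferenceId=build_inference_id(message))
print(build_inference_id({"source": "sqs", "job_id": "42"}))  # sqs-42
```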
Breaking Changes
None. This change is fully backward compatible: existing code continues to work exactly as before, using auto-generated UUIDs when no custom inference ID is provided.