
Conversation

viniciusdsmello
Contributor

Pull Request

Summary

This PR enables users to set custom inference IDs when using the @trace decorator, addressing the need for request correlation and integration with external systems. Previously, inference IDs were always auto-generated UUIDs, making it difficult to correlate traces with external request identifiers.

Changes

  • Add dedicated inference_id field to Trace class for cleaner architecture (see the sketch after this list)
  • Enable custom inference ID via update_current_trace(inferenceId="custom_id")
  • Maintain backward compatibility with auto-generated UUID fallback
  • Simplify post_process_trace logic by removing special metadata filtering
  • Update examples to demonstrate custom inference ID usage patterns
  • Add comprehensive test coverage for both custom and auto-generated IDs
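
A minimal sketch of what the dedicated field and its UUID fallback look like, assuming a plain Python class; this is illustrative only and not the SDK's actual Trace implementation or payload shape:

import uuid
from typing import Any, Dict, Optional

class Trace:
    """Illustrative trace container with a first-class inference_id field."""

    def __init__(self, inference_id: Optional[str] = None) -> None:
        # A custom ID wins; otherwise fall back to an auto-generated UUID,
        # preserving the previous behavior for existing users.
        self.inference_id = inference_id or str(uuid.uuid4())
        self.metadata: Dict[str, Any] = {}

    def to_dict(self) -> Dict[str, Any]:
        # inference_id is its own field, so no special metadata filtering is needed.
        return {"inferenceId": self.inference_id, "metadata": self.metadata}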

Context

Users need the ability to set custom inference IDs for several critical use cases:

  • Request Correlation: Link traces with external API request IDs
  • Batch Processing: Group related requests with correlation patterns
  • System Integration: Use existing request identifiers from upstream systems
  • Audit Trails: Maintain consistent identifiers across distributed systems

The previous implementation stored inferenceId in trace metadata, which was semantically incorrect and required complex filtering logic. This PR refactors the architecture to use a dedicated field while maintaining full backward compatibility.
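
Roughly, the simplification looks like the following; both function bodies are hypothetical and only illustrate the shape of the change, not the SDK's actual code:

import uuid

# Before: inferenceId lived in metadata, so it had to be pulled out and
# filtered from the payload before publishing.
def post_process_trace_before(trace):
    inference_id = trace.metadata.get("inferenceId") or str(uuid.uuid4())
    metadata = {k: v for k, v in trace.metadata.items() if k != "inferenceId"}
    return {"inferenceId": inference_id, "metadata": metadata}

# After: the dedicated field makes the special-casing unnecessary.
def post_process_trace_after(trace):
    return {"inferenceId": trace.inference_id, "metadata": trace.metadata}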

Closes: OPEN-7312

Testing

  • Unit tests: Created comprehensive tests for custom and auto-generated inference IDs (a rough sketch follows this list)
  • Manual testing: Verified functionality with live Openlayer API using provided credentials
  • Integration testing: Tested request correlation patterns and batch processing scenarios
  • Backward compatibility: Verified existing code continues to work unchanged
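
As a rough illustration of what the unit tests assert; the stand-in class below mirrors the fallback behavior and is not the SDK's actual class or test file:

import uuid
from typing import Optional

class TraceStandIn:
    """Stand-in mirroring the dedicated inference_id field and its UUID fallback."""

    def __init__(self, inference_id: Optional[str] = None) -> None:
        self.inference_id = inference_id or str(uuid.uuid4())

def test_custom_inference_id():
    assert TraceStandIn(inference_id="req_123").inference_id == "req_123"

def test_auto_generated_inference_id():
    # With no custom ID, the field falls back to a valid UUID string.
    uuid.UUID(TraceStandIn().inference_id)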

Usage Examples

Basic Custom Inference ID

import time

from openlayer.lib import trace, update_current_trace  # import path assumed; adjust to your SDK version

@trace()
def process_request(user_id: str):
    custom_id = f"req_{user_id}_{int(time.time())}"
    update_current_trace(inferenceId=custom_id)
    return "processed"

Request Correlation

import uuid  # trace and update_current_trace imported as in the first example

@trace()
def batch_process():
    batch_id = f"batch_{uuid.uuid4().hex[:8]}"
    update_current_trace(inferenceId=batch_id)
    # Process related items under the shared batch_id...

External System Integration

@trace()
def handle_api_request(external_request_id: str):
    update_current_trace(inferenceId=external_request_id)
    # Use existing request ID from external system
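
For instance, a web handler can forward the upstream request ID directly; the FastAPI wiring here is illustrative and not part of this PR:

from fastapi import FastAPI, Header

app = FastAPI()

@app.post("/process")
def process(x_request_id: str = Header(default="")):
    # Reuse the upstream request ID so the Openlayer trace and the
    # gateway logs share the same identifier.
    return handle_api_request(x_request_id or "unknown")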

Breaking Changes

None - This change is fully backward compatible. Existing code will continue to work exactly as before, using auto-generated UUIDs when no custom inference ID is provided.

viniciusdsmello self-assigned this Sep 9, 2025
whoseoyster merged commit 1f01ca5 into main Sep 9, 2025
5 checks passed
whoseoyster deleted the vini/open-7312-tracing-enable-custom-inferenceid-when-using-trace-decorator branch September 9, 2025 04:22