
Commit dd52a1a

[claude-code] Add top-level module doc for torch/distributed/tensor/_op_schema.py
Not sure how good the description is, seeking insight from maintainers.

Signed-off-by: Edward Yang <ezyang@meta.com>
ghstack-source-id: b6998b1
Pull-Request: #157804
1 parent eda0a9c commit dd52a1a

1 file changed (+23 -0 lines)

torch/distributed/tensor/_op_schema.py

Lines changed: 23 additions & 0 deletions
@@ -1,4 +1,27 @@
 # mypy: allow-untyped-defs
+"""
+DTensor operator schema definitions and utilities.
+
+This module defines the core data structures and utilities for describing and managing
+distributed tensor operations in PyTorch's DTensor system. It provides the foundational
+schema types used for sharding propagation, operator strategy selection, and distributed
+execution planning.
+
+Key components:
+- OpSpec: Describes acceptable sharding placements for operations
+- OpStrategy: Represents the possible sharding strategies for an operator
+- TupleStrategy: Container for multiple strategies when ops take a tuple/list of tensors as input
+- OpSchema: Describes operator input/output schemas with DTensorSpecs
+- OutputSharding: Manages output sharding specifications and redistribution
+- RuntimeSchemaInfo: Runtime execution metadata for operators
+- OpInfo: Complete runtime operator execution information
+
+These schema definitions enable the DTensor system to:
+1. Propagate tensor sharding information to operator outputs
+2. Greedily select sharding strategies for distributed operations
+3. Plan and execute tensor redistributions when needed
+4. Cache sharding decisions for performance optimization
+"""
 from collections.abc import Sequence
 from dataclasses import dataclass
 from functools import cached_property
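
As a rough, self-contained illustration of point 2 in the docstring (greedy strategy selection), here is a minimal Python sketch. The field names (placements, redistribute_cost), the pick_greedy helper, and the cost model are hypothetical, invented for illustration; they are not the actual attributes of DTensor's OpSpec and OpStrategy classes.

from dataclasses import dataclass, field


@dataclass
class OpSpec:
    # One candidate sharding placement for an operator.
    # Both fields are hypothetical stand-ins for illustration.
    placements: tuple[str, ...]      # e.g. ("Shard(0)", "Replicate")
    redistribute_cost: float = 0.0   # assumed cost of moving inputs into this placement


@dataclass
class OpStrategy:
    # All candidate sharding placements for one operator.
    strategies: list[OpSpec] = field(default_factory=list)

    def pick_greedy(self) -> OpSpec:
        # Greedy selection: choose the spec with the lowest redistribution cost.
        return min(self.strategies, key=lambda s: s.redistribute_cost)


strategy = OpStrategy(
    strategies=[
        OpSpec(placements=("Shard(0)",), redistribute_cost=2.0),
        OpSpec(placements=("Replicate",), redistribute_cost=0.5),
    ]
)
print(strategy.pick_greedy().placements)  # -> ('Replicate',)

The real module's selection also has to account for mesh topology and for caching previously computed sharding decisions (point 4 in the docstring), which this sketch omits.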
