[DTensor] Add InputDim.__eq__ type guard to prevent int comparison bugs #178599
stmcgovern wants to merge 1 commit into pytorch:main
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/178599
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 639a836 with merge base d3c36f0. This comment was automatically generated by Dr. CI and updates every 15 minutes.
InputDim is a dataclass wrapping an int (input_dim), which makes it easy to accidentally compare shard.dim (an int) against an InputDim instance instead of against InputDim.input_dim. Such a comparison silently returns False, turning the downstream logic into dead code. The original instance of this bug (shard.dim == in_dim) survived for over 3 years in propagate_shape_and_sharding before being fixed (pytorch#177973). This change adds a structural guard: InputDim.__eq__ raises TypeError when compared with non-DimSpec types (like int), directing the developer to use .input_dim instead. A matching __hash__ (salted with the class to avoid collisions with raw ints in dicts/sets) preserves the hash contract. This prevents the bug class from recurring.
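The bug class described above can be reproduced with a minimal stand-in dataclass (NaiveInputDim is a hypothetical name for illustration, not the actual DTensor class):

```python
from dataclasses import dataclass

# Hypothetical stand-in for DTensor's InputDim, using only the
# default dataclass-generated __eq__ (no type guard).
@dataclass(frozen=True)
class NaiveInputDim:
    input_dim: int

shard_dim = 0              # a plain int, e.g. shard.dim
in_dim = NaiveInputDim(0)  # wraps the same dimension

# The generated __eq__ returns NotImplemented for a non-matching class,
# int.__eq__ does the same, so Python falls back to identity and the
# comparison is silently False -- even though both refer to dimension 0.
print(shard_dim == in_dim)            # False
print(shard_dim == in_dim.input_dim)  # True -- the intended comparison
```

Because the broken comparison is a valid expression that simply evaluates to False, neither the type checker nor the tests flag it, which is how the original instance went unnoticed for years.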
e1d587b to 639a836
wconstab left a comment:
lgtm! let me ask claude to review just in case
@claude review this PR
Claude finished @wconstab's task in 3m 41s. Review of PR #178599:
Overall: This PR looks good. The change is a targeted defensive guard against a real bug class that went undetected for years. I reviewed all usages of …
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
[DTensor] Add InputDim.__eq__ type guard to prevent int comparison bugs (pytorch#178599)

Follow-on for pytorch#177972. InputDim is a dataclass wrapping an int (input_dim), which makes it easy to accidentally compare shard.dim (an int) with an InputDim instance instead of InputDim.input_dim. This comparison always returns False, making downstream logic dead code. The original instance of this bug (shard.dim == in_dim) in propagate_shape_and_sharding was fixed in pytorch#177973. This change adds a structural guard: InputDim.__eq__ raises TypeError when compared with non-DimSpec types (like int), directing the developer to use .input_dim instead. A matching __hash__ preserves the hash contract. This prevents the bug class from recurring. @wconstab

Pull Request resolved: pytorch#178599
Approved by: https://github.com/wconstab
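The guard described in the commit message can be sketched as follows (class hierarchy and error message simplified; this is not the exact DTensor implementation):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DimSpec:
    """Base class for dimension specs (simplified sketch)."""


@dataclass(frozen=True)
class InputDim(DimSpec):
    input_dim: int

    def __eq__(self, other: object) -> bool:
        # Structural guard: comparing against a non-DimSpec (e.g. a raw
        # int) is almost certainly a bug, so fail loudly instead of
        # silently returning False.
        if not isinstance(other, DimSpec):
            raise TypeError(
                f"Cannot compare InputDim with {type(other).__name__}; "
                "did you mean to compare against .input_dim?"
            )
        return isinstance(other, InputDim) and self.input_dim == other.input_dim

    def __hash__(self) -> int:
        # Salt the hash with the class so InputDim(3) is unlikely to land
        # in the same bucket as the raw int 3 in dicts/sets, where a
        # collision would trigger the TypeError above during lookup.
        return hash((InputDim, self.input_dim))
```

Note that @dataclass does not overwrite __eq__ or __hash__ when they are defined in the class body, so the explicit methods take precedence over the generated ones, and defining both together keeps the __eq__/__hash__ contract (equal objects hash equal) intact.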