Support of implicit broadcasting with unbounded dynamism #6219
Merged
Conversation
GleasonK
requested changes
Dec 20, 2023
Collaborator
GleasonK
left a comment
This is a great writeup on broadcasting rationale and the state of broadcasting in CHLO. Thanks for the thorough commit message.
I'm trying to figure out how to keep these methods as simple and maintainable as possible, left a few comments to that end.
GleasonK
reviewed
Dec 21, 2023
GleasonK
reviewed
Dec 21, 2023
GleasonK
reviewed
Dec 21, 2023
GleasonK
reviewed
Dec 21, 2023
GleasonK
reviewed
Dec 21, 2023
Force-pushed: 1f65892 to 4b6bc0d
GleasonK
approved these changes
Dec 21, 2023
Collaborator
GleasonK
left a comment
LGTM. One open comment; we can resolve it there.
Force-pushed: aac915f to 7627e33
Force-pushed: 7627e33 to 1fc1370
Force-pushed: 1fc1370 to 4de4102
GleasonK
approved these changes
Dec 22, 2023
ghpvnist
reviewed
Jan 2, 2024
Force-pushed: 14463bb to 284cb0b
lsy323
approved these changes
Jan 10, 2024
Collaborator
LGTM, thanks! Let's rebase after the CI is green again on HEAD.
Force-pushed: 284cb0b to 2def4f1
bhavya01
pushed a commit
that referenced
this pull request
Apr 22, 2024
Background
State of implicit broadcasting support in XLA and PyTorch/XLA codebases.
Recently, HLO was equipped with the capability to express unbounded dynamic shapes ref. The PyTorch/XLA bridge added the machinery to propagate unbounded dynamic dimensions from torch to XLA.
HLO, in its current form, can handle implicit broadcasting for static shapes and bounded dynamic shapes, which the PyTorch/XLA bridge currently leverages as a single source of truth.
However, there is no support in XLA/HLO for implicit broadcasting with unbounded dynamic shapes.
Relevance of shape assertion to support implicit broadcasting with unbounded dynamic shapes
With static and bounded dynamic shapes, it is feasible to check at compile time whether the broadcasting rules are met. With unbounded dynamic shapes, we need to rely on runtime guards (which we refer to as shape assertions) to ensure the shapes participating in broadcasting are valid. With that said, typical code generation for implicit broadcasting with unbounded shapes consists of two parts: (A) shape assertions, and (B) the broadcasting sequence, which actually performs the broadcasting assuming all the shape assertions hold.
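For intuition only (a simplified sketch, not the actual bridge or XLA code), part (A) can be thought of as a NumPy-style compatibility check evaluated at runtime: aligning the two shapes from the trailing dimension, each dimension pair must be equal or contain a 1.

```python
def assert_broadcast_compatible(shape_a, shape_b):
    """Runtime shape assertion (part A): verify two concrete shapes
    satisfy NumPy-style broadcasting rules."""
    # Align shapes from the trailing dimension; the shorter shape is
    # implicitly padded with 1s, which are always compatible.
    for da, db in zip(reversed(shape_a), reversed(shape_b)):
        assert da == db or da == 1 or db == 1, (
            f"shapes {shape_a} and {shape_b} are not broadcast-compatible")

# (2, 3, 1) and (3, 4) are compatible: trailing pairs are (1, 4) and (3, 3).
assert_broadcast_compatible((2, 3, 1), (3, 4))
```

With unbounded dimensions, these checks cannot be discharged at compile time, which is why they must be emitted as runtime guards.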
For example, the CHLO dialect supports implicit broadcasting with unbounded dynamic shapes ref. The support relies on shape dialect ops, with shape assertions embedded in the code to check that the broadcasting rules are met. Please refer to the Appendix for what the lowered mhlo code (chlo ops → shape dialect ops → mhlo ops) would look like, and note in particular what the shape assertions and broadcasting sequence look like.
Similarly, JAX supports experimental lowering of polymorphic shape specifications to StableHLO, with shape assertions to validate at runtime that the specification holds.
PyTorch symbolic shape specification
Per ref and ref, PyTorch allows constraints over dynamic dimensions. These shape constraints are currently represented in the FX graph and can be converted to assertions.
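As a purely hypothetical sketch (the function name and signature are illustrative, not the PyTorch API), converting a recorded range constraint such as `2 <= s0 <= 16` on a dynamic dimension into a runtime assertion amounts to:

```python
def emit_constraint_check(dim_value, lower, upper):
    """Hypothetical sketch: materialize a symbolic dimension constraint
    (lower <= dim <= upper) from the exported graph as a runtime assertion."""
    assert lower <= dim_value <= upper, (
        f"dynamic dimension {dim_value} violates constraint [{lower}, {upper}]")

emit_constraint_check(8, 2, 16)  # a value inside the range passes
```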
Proposal
Support implicit broadcasting with unbounded dynamic shapes at PT/XLA level.
With the shape constraint specification provided at the framework (PyTorch) level, it would make sense for PyTorch/XLA to leverage that information while doing implicit broadcasting.
Another option was to support implicit broadcasting at the XLA level. This is discouraged because HLO has no notion of a shape constraint specification, so the support would require propagating that information via the PyTorch/XLA bridge. Any change in the specification format or semantics would then require changes to the HLO client APIs.
Current PR
In the current PR, we implement only the broadcast sequence (refer to B above), assuming that the participating shapes meet the broadcasting rules at runtime. Tracking issue #6232 covers making sure that the shape constraints in the FX graph, provided in terms of the PyTorch shape specification, are converted to shape assertions.
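Under these assumptions, a simplified sketch of the broadcast sequence (part B) can model an unbounded dimension with a `None` sentinel and compute the output shape dimension-wise, relying on the premise that the compatibility assertions already hold:

```python
DYN = None  # sentinel for an unbounded dynamic dimension

def broadcast_result_shape(shape_a, shape_b):
    """Compute the broadcasted output shape, assuming the shape
    assertions (compatibility checks) already hold at runtime."""
    # Left-pad the shorter shape with 1s, then merge dimension-wise.
    rank = max(len(shape_a), len(shape_b))
    a = (1,) * (rank - len(shape_a)) + tuple(shape_a)
    b = (1,) * (rank - len(shape_b)) + tuple(shape_b)
    out = []
    for da, db in zip(a, b):
        if da == 1:
            out.append(db)  # result takes the other extent (may be DYN)
        elif db == 1:
            out.append(da)
        elif da is DYN or db is DYN:
            # Both non-1; the assertions guarantee equality at runtime,
            # so pick whichever extent is statically known (or DYN if neither).
            out.append(da if da is not DYN else db)
        else:
            out.append(da)  # da == db by the compatibility assumption
    return tuple(out)

result = broadcast_result_shape((DYN, 3, 1), (4,))  # -> (None, 3, 4)
```

In the actual lowering this dimension-merging logic is expressed as a sequence of HLO/MHLO ops rather than a Python loop, but the per-dimension rule is the same.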
Example
With the proposed change, the following mini-model
can be exported to the following StableHLO code
Appendix
Consider the following legalization of chlo.broadcast_add to MHLO ops via the Shape dialect.