a.new_tensor([num]) should not introduce a graph break #176067
Closed
Labels
PT2-Bug-Bash (Actionable issues for PT2-Bug-Bash), bot-triaged (This is a label only to be used by the auto triage bot), module: dynamic shapes, module: dynamo, oncall: pt2, triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
🐛 Describe the bug
For this simple case, `f2` has a graph break while `f1` doesn't:
Error logs

Versions
versions.log
To my understanding, if `torch.tensor([num])` can be compiled without a break, shouldn't `a.new_tensor([num])` also pass?

cc @chauhang @penguinwu @ezyang @bobrenjc93 @aditvenk @laithsakka @voznesenskym @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @wenzhe-nrv @jiayisunx @kadeng @amjames @Lucaskabela @jataylo