Revert "Revert "Python Bindings for SymInts (#78135)""#79608
Revert "Revert "Python Bindings for SymInts (#78135)""#79608ezyang wants to merge 2 commits intogh/ezyang/1222/basefrom
Conversation
This reverts commit b8db0a0. [ghstack-poisoned]
❌ 17 New Failures as of commit 06ebcd9 (more details on the Dr. CI page):

🕵️ 17 new failures recognized by patterns. The following CI failures do not appear to be due to upstream breakages.
Here is the diff from Nick's OG PR:
```python
def test_fx_trace_intlist(self):
    class CustomModule(torch.nn.Module):
        def forward(self, x):
            bs, c, h, w = x.shape
            return F.pad(x, (0, w % 2, 0, h % 2, 0, 0))

    m = CustomModule()
    x = torch.rand(1, 3, 4, 4)
    # should not TypeError: pad(): argument 'pad' (position 2) must be
    # tuple of ints, not tuple
    torch.fx.symbolic_trace(m)
```
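For context on what this test exercises (an illustrative sketch, not code from the PR; the helper name `peek` is made up): under `torch.fx` symbolic tracing, the values unpacked from `x.shape` are `torch.fx.Proxy` objects, so `w % 2` is also a Proxy and `F.pad` receives a tuple of Proxies rather than ints. The old int-list argument parser rejected that with the `TypeError` quoted in the test comment; with the SymInt bindings it should trace cleanly.

```python
import torch
import torch.fx
import torch.nn.functional as F

def peek(x):
    # Under symbolic tracing, shape unpacking yields fx Proxy objects.
    bs, c, h, w = x.shape
    assert isinstance(w % 2, torch.fx.Proxy)
    # The pad argument is thus a tuple containing Proxies, not ints.
    return F.pad(x, (0, w % 2, 0, h % 2, 0, 0))

# With this PR applied, tracing should succeed instead of raising the
# TypeError about "tuple of ints, not tuple".
traced = torch.fx.symbolic_trace(peek)
```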
Technically, we do not need to create a custom module to repro the issue, just a function:

```python
def foo(x):
    bs, c, h, w = x.shape
    return F.pad(x, (0, w % 2, 0, h % 2, 0, 0))

torch.fx.symbolic_trace(foo)
```
Agreed, or at least remove the unused `x` variable.
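Combining both suggestions, the test body might look like this (a sketch of the proposed simplification, not the committed version):

```python
def test_fx_trace_intlist(self):
    def foo(x):
        bs, c, h, w = x.shape
        return F.pad(x, (0, w % 2, 0, h % 2, 0, 0))

    # should not TypeError: pad(): argument 'pad' (position 2) must be
    # tuple of ints, not tuple
    torch.fx.symbolic_trace(foo)
```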
This reverts commit b8db0a0.

ghstack-source-id: 602ffd6
Pull Request resolved: pytorch#79608
Stack from ghstack (oldest at bottom):
This reverts commit b8db0a0.