[opinfo] nn.functional.unfold#62705
💊 CI failures summary and remediations — As of commit ea61279 (more details on the Dr. CI page): 💚 Looks good so far! There are no failures yet. 💚
krshrimali
left a comment
This looks great, thanks @kshitij12345! I added a comment on how we can add a few more cases for completeness.
It's interesting, though, that we already have an OpInfo for unfold (I wasn't aware of this). I'm not sure we need to keep it (or whether it's any different), since it doesn't cover the dilation, padding, and stride args:
pytorch/torch/testing/_internal/common_methods_invocations.py
Lines 7687 to 7694 in 773a8ee
Thanks!
torch.unfold is actually a different operation from torch.nn.functional.unfold (pytorch/torch/nn/functional.py, line 4481 in 773a8ee).
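To illustrate the distinction being made here, a minimal sketch (not part of the PR; the input shape and parameter values are chosen purely for demonstration): `Tensor.unfold` slides a window along a single dimension, while `torch.nn.functional.unfold` is the im2col operation, which extracts 2D sliding blocks and accepts the dilation, padding, and stride arguments discussed above.

```python
import torch
import torch.nn.functional as F

x = torch.arange(16.0).reshape(1, 1, 4, 4)  # (N, C, H, W)

# Tensor.unfold: slices ONE dimension into overlapping windows.
# Here dim=2 (height), size=2, step=2 -> shape (1, 1, 2, 4, 2).
windows = x.unfold(2, 2, 2)

# F.unfold (im2col): extracts sliding 2D blocks over (C, H, W) and
# supports kernel_size, dilation, padding, and stride.
# Output shape is (N, C * kh * kw, L) = (1, 4, 4) here.
cols = F.unfold(x, kernel_size=2, dilation=1, padding=0, stride=2)

# The first column is the top-left 2x2 patch, flattened row-major.
print(cols[0, :, 0])  # tensor([0., 1., 4., 5.])
```

Because the two ops take different arguments and produce differently shaped outputs, they need separate OpInfo entries with their own sample inputs.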
krshrimali
left a comment
LGTM, thanks @kshitij12345!
zou3519
left a comment
One really minor comment about the input shapes. But I'm also happy to merge this as-is and add the shape in a follow-up; please let me know how you'd like to proceed.
@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
I wonder if we should partition OpInfos across multiple files to avoid merge conflicts. Right now I can only really merge one of these PRs a day (two sometimes if I'm lucky) because I need to rebase internally, ship it, and then wait for the change to reach viable/strict.
This is being pursued in #59871 (though targeting a particular kind of OpInfo, like nn.functional, will still lead to conflicts).
Reference: pytorch/functorch#78