An implementation of torch.tile as requested in pytorch/pytorch#38349 (#47974)
arbfay wants to merge 11 commits into pytorch:master from
Conversation
Hi @arbfay! Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
💊 CI failures summary and remediations
As of commit e253f12 (more details on the Dr. CI page): 💚 💚 Looks good so far! There are no failures yet. 💚 💚
This comment was automatically generated by Dr. CI. This comment has been revised 68 times.
cc @mruberry for NumPy compatibility - could you take a look at this please?
mruberry
left a comment
Hi @arbfay, thanks for this contribution!
Overall this PR looks pretty good. I'm not totally sure why you're getting the error you are; it may be as simple as calling repeat instead of at::repeat. You might have to look at the intermediate values and debug the issue, though.
I've left some comments that are mostly style/docs suggestions. The tests also need to be more thorough to account for more possible cases.
Looking forward to reviewing the follow-up!
Hi @mruberry, thanks for this review! I expected lots of comments for my first PR to PyTorch, and I got served 😄 Waiting now for your next review.
This failure on ROCm is real. You can always use the pattern … Would you correct this test, and I'll take a look after we get an updated test signal?
@mruberry I think the tests are all OK (the one that's failing is, I think, because the pipeline is broken).
mruberry
left a comment
Hey @arbfay!
Just a couple minor implementation issues and a small doc extension + test extension needed. Otherwise this PR looks good.
For the tests, this is currently missing autograd tests. That's understandable since we don't do a good job explaining how to add them.
You can automatically generate autograd tests by adding an entry to `common_methods_invocations.py`.
#46041 recently did this, too, so that might be a useful reference. Let me know if you have any questions.
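For illustration only — the exact tuple schema of the `method_tests()` list in `common_methods_invocations.py` varies across PyTorch versions, and the field names below are assumptions, not the file's actual contents — such an entry looks roughly like:

```python
# Hypothetical method_tests() entry: (op_name, input_size, op_args, variant_name).
# The real schema in common_methods_invocations.py may differ by version.
('tile', (2, 3), ([2, 1, 2],), 'more_reps_dims'),
```

The test generator then builds forward, gradcheck, and gradgradcheck tests from each such entry automatically.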
Thanks @mruberry! I've been struggling a bit with the autograd tests. I've added more doc as asked, and used
I hope you don't mind, but I tweaked the docs slightly and tried to fix the common_methods() tests. If my fix for the tests didn't work properly, then we can remove them for now to unblock this PR. Let's just wait for the CI to run.
…ch.Size call in common_methods_invocations
Thanks for your commits @mruberry, they are more than welcome! I've tried with your modification in common_methods, and only one test is still a problem (no_reps_dims; I pushed a commit to remove it). This is great: I've been struggling to understand what was causing those failed assertions. Let's wait for the pipelines.
Codecov Report
@@ Coverage Diff @@
## master #47974 +/- ##
==========================================
- Coverage 81.15% 81.14% -0.01%
==========================================
Files 1838 1838
Lines 198653 198663 +10
==========================================
- Hits 161214 161211 -3
- Misses 37439 37452 +13
@mruberry is the PR good to go?
facebook-github-bot
left a comment
@mruberry has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Congrats @arbfay! Thanks for implementing torch.tile!
Thanks @mruberry, and thanks for the help! It's been a nice ride, and I've learned a lot!
…h#47974) Summary: The approach is to simply reuse `torch.repeat` but adding one more functionality to tile, which is to prepend 1's to reps arrays if there are more dimensions to the tensors than the reps given in input. Thus for a tensor of shape (64, 3, 24, 24) and reps of (2, 2) will become (1, 1, 2, 2), which is what NumPy does. I've encountered some instability with the test on my end, where I could get a random failure of the test (due to, sometimes, random value of `self.dim()`, and sometimes, segfaults). I'd appreciate any feedback on the test or an explanation for this instability so I can this. Pull Request resolved: pytorch#47974 Reviewed By: ngimel Differential Revision: D25148963 Pulled By: mruberry fbshipit-source-id: bf63b72c6fe3d3998a682822e669666f7cc97c58