allow concatenating "hybrid" (sparse/dense) tensors along their dense dimensions #13761
umanwizard wants to merge 7 commits into pytorch:master
Conversation
hmm... I thought my tests were passing, looks like I probably screwed something up here. Closing until I figure out how to fix it.

should be fixed now
aten/src/ATen/native/TensorShape.cpp (review comments marked as off-topic)
weiyangfb left a comment:
LGTM! Please address some minor nits
facebook-github-bot left a comment:
@umanwizard has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
… dimensions (#13761)

Summary:
Follow-up to #13577

The idea is to take each values tensor and concatenate it with zeros before and after itself (along the dimension corresponding to the one we're catting the tensors along), producing the values that tensor contributes to the result. Then we concatenate all of those together to get the final values tensor. (Hopefully this is clearer from the example in the comments.)

The indices are more straightforward: since we aren't concatenating along a sparse dimension, they don't change at all, so all we need to do is concatenate the indices from the different tensors together.

Pull Request resolved: pytorch/pytorch#13761
Differential Revision: D13160343
Pulled By: umanwizard
fbshipit-source-id: 13d7adecd369e0eebdf5bce3d90a51029b66bd1d
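The zero-padding scheme for the values tensors can be sketched as follows. This is a NumPy illustration of the idea only, not the actual ATen implementation; the function `cat_hybrid_values` and its argument names are invented for this sketch. Each input tensor's values array has shape `(nnz_i, *dense_dims)`; we pad it with zeros before and after along the dense cat dimension, then stack all padded values along the nnz dimension:

```python
import numpy as np

def cat_hybrid_values(values_list, cat_dim):
    """Sketch of the values computation when catting hybrid sparse tensors
    along a dense dimension.

    values_list: per-tensor values arrays, each of shape (nnz_i, *dense_dims).
    cat_dim: axis within the values arrays corresponding to the dense
             dimension being concatenated (axis 0 is the nnz axis).
    """
    sizes = [v.shape[cat_dim] for v in values_list]
    total = sum(sizes)
    padded, offset = [], 0
    for v, sz in zip(values_list, sizes):
        # Zeros standing in for every other tensor's contribution,
        # before and after this tensor's slice of the result.
        before_shape = list(v.shape)
        before_shape[cat_dim] = offset
        after_shape = list(v.shape)
        after_shape[cat_dim] = total - offset - sz
        padded.append(np.concatenate(
            [np.zeros(before_shape), v, np.zeros(after_shape)], axis=cat_dim))
        offset += sz
    # Stack the per-tensor padded values along the nnz axis; the indices of
    # the result are just the inputs' indices concatenated, unchanged.
    return np.concatenate(padded, axis=0)

# Two hybrid tensors with 2 nonzeros each and dense widths 2 and 3:
a_vals = np.ones((2, 2))            # each nonzero is a dense row of length 2
b_vals = np.full((2, 3), 3.0)       # each nonzero is a dense row of length 3
result = cat_hybrid_values([a_vals, b_vals], cat_dim=1)
# result has shape (4, 5): a's rows are [1, 1, 0, 0, 0],
# b's rows are [0, 0, 3, 3, 3].
```

Note that this materializes the zero padding densely in each values tensor; that is fine here because values tensors are dense by construction in the hybrid (sparse/dense) format.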