Fix memory leak in THCSTensor_spcadd. #1519

Merged: soumith merged 1 commit into pytorch:master from ezyang:leak-fix on May 15, 2017

Conversation

@ezyang
Contributor

@ezyang ezyang commented May 9, 2017

Signed-off-by: Edward Z. Yang <ezyang@fb.com>
@apaszke
Contributor

apaszke commented May 9, 2017

can you fix that unnecessary freeCopyTo as well?

@ezyang
Contributor Author

ezyang commented May 9, 2017

Yes, working on it :)

@soumith soumith merged commit 0f458ee into pytorch:master May 15, 2017
@ezyang ezyang deleted the leak-fix branch September 7, 2017 20:23
jjsjann123 added a commit to jjsjann123/pytorch that referenced this pull request Mar 21, 2022
Fixes pytorch#1514

The issue arises when we try to collapse a dimension into a broadcasted dimension. PyTorch marks every dimension with stride 1 as contiguous. This used to be fine, but we recently changed things so that a broadcasted dimension with stride 0 can now be placed as a faster-moving dimension. As a result, we can end up collapsing a normal dimension into a broadcasted dimension, and the computed index is wrong, since the broadcasted dimension has stride 0.

The solution is to explicitly check the inner dimension's stride when testing for contiguity, and only collapse when the inner dimension is not broadcasted.
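As a rough illustration of the check described above (a hypothetical sketch, not the actual nvfuser/PyTorch code), consider dimensions as `(size, stride)` pairs. A pair of dimensions can be merged into one only when the memory layout is truly contiguous across them, which a stride-0 (broadcasted) inner dimension can never satisfy:

```python
# Hypothetical helper: decide whether an outer dim may be collapsed
# into its inner neighbor. Names and signature are illustrative only.

def can_collapse(outer, inner):
    """outer/inner are (size, stride) pairs for adjacent dimensions."""
    o_size, o_stride = outer
    i_size, i_stride = inner
    # A broadcasted inner dimension (stride 0) must not be merged:
    # the collapsed index would address elements the storage lacks.
    if i_stride == 0:
        return False
    # Genuinely contiguous pair: outer stride spans the inner extent.
    return o_stride == i_size * i_stride

# A (4, 3) view broadcast from shape (4, 1) has strides (1, 0).
# Both strides "look" contiguous, yet collapsing to a single dim of
# size 12 would index 12 distinct elements in a 4-element storage.
print(can_collapse((4, 1), (3, 0)))   # broadcasted inner dim: False
print(can_collapse((4, 3), (3, 1)))   # truly contiguous pair: True
```

The key point matches the fix: contiguity of the outer dimension alone is not enough; the inner dimension's stride must be inspected before collapsing.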
jagadish-amd pushed a commit to jagadish-amd/pytorch that referenced this pull request Sep 5, 2024