Revert "Avoid unnecessary copy in TensorSource (#8849)"#9379

Merged
jeffhataws merged 1 commit into master from jeffhataws_fix_9378
Jun 18, 2025

Conversation

@jeffhataws (Collaborator) commented:

This reverts commit 8dc5b49.

See #9378

@pytorch-bot added the ci-no-td label Jun 17, 2025
@jeffhataws requested review from lsy323, pgmoka and tengyifei and removed request for lsy323 and tengyifei June 17, 2025 17:54
@jeffhataws force-pushed the jeffhataws_fix_9378 branch from 0c2c622 to fd8a1b4 June 17, 2025 17:56
@jeffhataws requested review from lsy323 and ysiraichi June 17, 2025 21:37
@tengyifei removed their request for review June 17, 2025 23:01
@ysiraichi (Collaborator) left a comment:

LGTM.
I will say that it's really odd that this is the thing that's hanging. If copy=False, it just means that this tensor.to(...) call might be a no-op, and I don't really see how that would cause a hang.
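The no-op behavior of copy=False that the comment refers to can be sketched with NumPy's analogous astype(..., copy=...) semantics (torch.Tensor.to behaves similarly, but NumPy keeps the sketch self-contained; this is an illustration, not the torch_xla code in question):

```python
import numpy as np

# With copy=False, a conversion that requires no change returns the
# original buffer (a no-op); with copy=True a copy is always made.
a = np.ones(3, dtype=np.float32)

same = a.astype(np.float32, copy=False)   # dtype already matches
forced = a.astype(np.float32, copy=True)  # always materializes a copy

print(same is a)    # True: the very same array object, no copy
print(forced is a)  # False: distinct array backed by new memory
```

In other words, copy=False only *permits* skipping the copy when nothing needs to change, which is why a hang (rather than, say, aliasing corruption) is surprising here.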

@jeffhataws jeffhataws merged commit efdf117 into master Jun 18, 2025
24 checks passed
@jeffhataws jeffhataws deleted the jeffhataws_fix_9378 branch July 9, 2025 04:39
acolicTT added a commit to tenstorrent/pytorch-xla that referenced this pull request Feb 23, 2026
Whenever a tensor is transferred to our runtime from torch_xla, the tensor is copied.
This causes unnecessary data duplication on our side.

This change disables the tensor copy when the tensor can be borrowed.
Note that PJRT expects data to be contiguous in memory, which is not always
true on the PyTorch side, so copies cannot be completely avoided.

Since this change has been checked in before and later reverted, we need to
test whether this will work in our environment.

Please take a look at these for more info:
pytorch#8849
pytorch#9379
pytorch#9378
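The contiguity constraint mentioned in the commit message can be illustrated with a NumPy sketch (torch.Tensor exposes the same distinction via is_contiguous()/contiguous(); this is an analogy, not the torch_xla implementation):

```python
import numpy as np

# PJRT wants a contiguous host buffer it can borrow. A dense array
# qualifies, but a strided view such as a transpose does not, so the
# view must be materialized (copied) before its pointer is handed over.
a = np.arange(6, dtype=np.float32).reshape(2, 3)
view = a.T  # transpose is a strided view, not C-contiguous

print(a.flags['C_CONTIGUOUS'])     # True: could be borrowed directly
print(view.flags['C_CONTIGUOUS'])  # False: must be copied first

packed = np.ascontiguousarray(view)  # the one unavoidable copy
print(packed.flags['C_CONTIGUOUS']) # True
```

This is why borrowing can eliminate the copy only for tensors whose storage is already contiguous; for strided views a copy remains necessary.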