
Make from_dlpack handle cuda synchronization implicitly for input tensors that have __dlpack__ and __dlpack_device__ attributes.#7125

Merged
vanbasten23 merged 6 commits into master from xiowei/use_producer_dlpack_device
May 30, 2024

Conversation

@vanbasten23 (Collaborator) commented May 28, 2024

from_dlpack should leverage the __dlpack__ and __dlpack_device__ attributes of a tensor. That way, from_dlpack will handle the CUDA synchronization implicitly. Similar approaches can be found in PyTorch and JAX.

This PR fixes the direction in which we convert an external tensor, such as a CUDA tensor, to an XLA tensor via from_dlpack. The other direction (converting an XLA tensor to a CUDA tensor) requires a change in upstream PyTorch.
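For context, the consumer-side pattern described above can be sketched as follows. This is an illustrative mock, not the actual torch_xla implementation: the `FakeCudaTensor` class, the `from_dlpack` body, and the returned `"capsule"` string are all stand-ins used only to show how a consumer queries `__dlpack_device__` and passes a stream to `__dlpack__` so the producer can synchronize implicitly.

```python
# Sketch of the DLPack consumer-side protocol, using mock objects.
# kDLCUDA is 2 in the DLPack device-type enum.
CUDA_DEVICE_TYPE = 2

class FakeCudaTensor:
    """Stand-in producer implementing the DLPack protocol (hypothetical)."""

    def __dlpack_device__(self):
        # Returns (device_type, device_id).
        return (CUDA_DEVICE_TYPE, 0)

    def __dlpack__(self, stream=None):
        # A real producer would make its pending work stream-ordered
        # with respect to `stream` before exporting; here we just
        # record the stream and return a placeholder capsule.
        self.exported_stream = stream
        return "capsule"

def from_dlpack(ext_tensor):
    """Mock consumer: query the producer's device, then pass a stream
    so the producer can synchronize implicitly (no manual sync needed)."""
    if hasattr(ext_tensor, "__dlpack_device__") and hasattr(ext_tensor, "__dlpack__"):
        device_type, _device_id = ext_tensor.__dlpack_device__()
        if device_type == CUDA_DEVICE_TYPE:
            # A real consumer passes its current CUDA stream here;
            # stream=1 denotes the legacy default stream in the DLPack spec.
            capsule = ext_tensor.__dlpack__(stream=1)
        else:
            capsule = ext_tensor.__dlpack__()
        # A real implementation would wrap the capsule into an XLA tensor.
        return capsule
    raise TypeError("object does not implement the DLPack protocol")
```

The key point is that the consumer never calls a device-wide synchronize; it hands its stream to the producer via `__dlpack__(stream=...)`, and the producer orders its work against that stream.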

@vanbasten23 vanbasten23 changed the title Handle cuda synchronize implicitly Make from_dlpack handle cuda synchronize implicitly for input tensors that have __dlpack__ and __dlpack_device__ attributes. May 28, 2024
@vanbasten23 vanbasten23 changed the title Make from_dlpack handle cuda synchronize implicitly for input tensors that have __dlpack__ and __dlpack_device__ attributes. Make from_dlpack handle cuda synchronization implicitly for input tensors that have __dlpack__ and __dlpack_device__ attributes. May 28, 2024
@JackCaoG (Collaborator) commented

Let me know when this is ready for review

@vanbasten23 (Collaborator, Author) commented

> Let me know when this is ready for review

Thanks. It's ready for review.

@vanbasten23 vanbasten23 marked this pull request as ready for review May 28, 2024 20:09
Comment thread test/test_operations.py Outdated
Comment thread torch_xla/utils/dlpack.py
@vanbasten23 vanbasten23 force-pushed the xiowei/use_producer_dlpack_device branch from d42d453 to c489851 on May 29, 2024 23:42
@vanbasten23 vanbasten23 requested a review from JackCaoG May 30, 2024 16:39
Comment thread test/test_operations.py
Comment thread torch_xla/utils/dlpack.py
@JackCaoG (Collaborator) left a comment

mostly lgtm

@vanbasten23 (Collaborator, Author) commented

Thanks Jack for the review!

@vanbasten23 vanbasten23 merged commit daada22 into master May 30, 2024
@vanbasten23 vanbasten23 mentioned this pull request May 31, 2024
yitongh pushed a commit to AlibabaPAI/xla that referenced this pull request Oct 11, 2024
…sors that have __dlpack__ and __dlpack_device__ attributes. (pytorch#7125)
yitongh pushed a commit to AlibabaPAI/xla that referenced this pull request Dec 11, 2024
…sors that have __dlpack__ and __dlpack_device__ attributes. (pytorch#7125)
yitongh pushed a commit to AlibabaPAI/xla that referenced this pull request Dec 11, 2024
…sors that have __dlpack__ and __dlpack_device__ attributes. (pytorch#7125)

2 participants