
Restore cuda variable.bernoulli() #4787

Merged
ezyang merged 1 commit into pytorch:master from zou3519:bernoulli-cuda-var
Jan 24, 2018
Conversation

Contributor

@zou3519 commented Jan 22, 2018

Fixes #4715

Variable bernoulli() was previously restricted to the CPU backend in Declarations.cwrap. This PR enables the CUDA backend for it as well.
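
For illustration, a minimal sketch of the call this PR enables (assuming a CUDA build and the Variable API of this era; the shape and probability below are made up for the example):

import torch
from torch.autograd import Variable

# Each entry of probs is a probability in [0, 1]; bernoulli() draws a 0/1 sample per entry.
probs = Variable(torch.cuda.FloatTensor(2, 3).fill_(0.5))
samples = probs.bernoulli()  # previously raised for CUDA Variables; now dispatches to the CUDA backend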

Test Plan

new unit test
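
(The new unit test isn't quoted in the thread; the sketch below is a hypothetical version of such a check, assuming the same era's API — the actual test may differ.)

import torch
from torch.autograd import Variable

def check_bernoulli_cuda_variable():
    p = Variable(torch.cuda.FloatTensor(100).fill_(0.5))
    s = p.bernoulli()
    # every sampled value must be exactly 0 or 1
    assert bool(((s.data == 0) | (s.data == 1)).all())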

@pytorchbot
Collaborator

@zou3519, thanks for your PR! We identified @zdevito to be a potential reviewer.

Member

@colesbury left a comment


lgtm

@vadimkantorov
Contributor

vadimkantorov commented Jan 22, 2018

I checked Declarations.cwrap for CPU-only functions. Some are known to be CPU-only (kthvalue, standard_gamma, some matrix ops), but maybe this is a new one:

torch.randperm(5, out = torch.LongTensor()) # works
torch.randperm(5, out = torch.autograd.Variable(torch.LongTensor())) # works

torch.randperm(5, out = torch.cuda.LongTensor()) # fails
# TypeError: Type torch.cuda.LongTensor doesn't implement stateless method randperm
torch.randperm(5, out = torch.autograd.Variable(torch.cuda.LongTensor())) # fails
# RuntimeError: randperm_out is not implemented for type torch.cuda.LongTensor

@zou3519
Contributor Author

zou3519 commented Jan 22, 2018

@vadimkantorov there's currently no CUDA implementation of randperm
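
(A hedged workaround sketch, not from the thread: until a CUDA randperm lands, one can generate the permutation on the CPU and move it over.)

import torch
from torch.autograd import Variable

# randperm runs on the CPU type; .cuda() copies the result to the GPU.
perm = torch.randperm(5).cuda()
perm_var = Variable(perm)  # wrap it if a Variable is needed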

@ezyang ezyang merged commit c7a2e31 into pytorch:master Jan 24, 2018
facebook-github-bot pushed a commit that referenced this pull request Aug 7, 2020
…#4787)

Summary:
Pull Request resolved: pytorch/glow#4787

Resurrect ONNX as a backend through onnxifiGlow (was killed as part of D16215878). Then look for the `use_glow_aot` argument in the Onnxifi op. If it's there and true, then we override whatever `backend_id` is set and use the ONNX backend.

Reviewed By: yinghai, rdzhabarov

Differential Revision: D22762123

fbshipit-source-id: abb4c3458261f8b7eeae3016dda5359fa85672f0
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026