Revert "torch.set_num_threads sets MKL option too" #4967

Merged
soumith merged 6 commits into pytorch:master from ssnl:revert-4949-mklthreads
Jan 31, 2018
Conversation

Collaborator

@ssnl ssnl commented Jan 31, 2018

Reverts #4949

The recent change that makes torch.set_num_threads also set the MKL thread count is causing segfaults under multiprocessing on the CUDA CI machines. Our dataloader workers do call torch.set_num_threads(1) before the loading loop. Let's revert this first to unblock the other PRs.
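For context, the crash pattern described above involves worker processes adjusting thread counts after fork, while the MKL runtime may already be initialized in the parent. A commonly used, safer alternative (not part of this PR; the helper name here is hypothetical) is to cap thread counts via environment variables before the math libraries initialize in the worker:

```python
import os

def limit_worker_threads(n=1):
    """Hypothetical sketch: cap BLAS/OpenMP threading via env vars.

    Setting OMP_NUM_THREADS / MKL_NUM_THREADS must happen *before* the
    worker process initializes MKL/OpenMP, which avoids calling into the
    MKL runtime from a forked child (the scenario that segfaulted on the
    CUDA CI machines per this PR's description).
    """
    for var in ("OMP_NUM_THREADS", "MKL_NUM_THREADS"):
        os.environ[var] = str(n)
    # Return the values actually set, for inspection/testing.
    return {var: os.environ[var] for var in ("OMP_NUM_THREADS", "MKL_NUM_THREADS")}
```

This sidesteps the ordering problem entirely: the environment is inherited or set at worker startup, so no library call races against MKL's own initialization.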

@soumith soumith merged commit f2d3f20 into pytorch:master Jan 31, 2018
@ssnl ssnl deleted the revert-4949-mklthreads branch January 31, 2018 20:39
Contributor

apaszke commented Jan 31, 2018

Hey, why did you revert multiple other PRs?

laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
* Revert "Clarify grad_input_mask documentation in derivatives.yaml (pytorch#4963)"

This reverts commit 37c8454.

* Revert "fix triu and tril for zero-strided inputs on gpu (pytorch#4962)"

This reverts commit 9acb9be.

* Revert "Add mutex for CPU RNG and move TH to C++ (pytorch#4041)"

This reverts commit 07b7fe2.

* Revert "Support multivariate TransformedDistributions (pytorch#4937)"

This reverts commit 18bdb4a.

* Revert "Only check that arguments are Variables in VariableType (pytorch#4943)"

This reverts commit a479ca0.

* Revert "torch.set_num_threads sets MKL option too (pytorch#4949)"

This reverts commit 2cfb339.