
[dist_optim] add warning to distributed optimizer #50630

Closed
wanchaol wants to merge 8 commits into gh/wanchaol/157/base from gh/wanchaol/157/head

Conversation

@wanchaol (Collaborator) commented Jan 15, 2021

Stack from ghstack:

Add a warning log to the distributed optimizer to warn the user that the optimizer is created without TorchScript support.

Differential Revision: D25932777
wanchaol added a commit that referenced this pull request Jan 15, 2021

Add a warning log to the distributed optimizer to warn the user that the optimizer is created without TorchScript support.

ghstack-source-id: 2e272bd
Pull Request resolved: #50630
Comment thread torch/distributed/optim/optimizer.py Outdated
"slow computation time in multithreading environment (i.e. Distributed Model "
"Parallel training on CPU) due to the python Global Interpreter Lock (GIL). "
"Please file an issue if you need this optimizer in TorchScript. ",
str(optimizer_class)
Contributor:

Nit: use f-string
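As a hedged illustration of this nit (the logger name and helper names below are stand-ins, not the PR's actual code): an f-string builds the message eagerly, while the logging module's %-style defers interpolation until a handler actually emits the record, which is the usual reason logging code keeps the `%s` form.

```python
import logging

logger = logging.getLogger("dist_optim_example")  # hypothetical logger name

def warn_fstring(optimizer_class):
    # f-string, as the review suggests: the message string is built
    # eagerly, even if the log record ends up filtered out.
    logger.warning(
        f"Creating the optimizer {optimizer_class} without TorchScript support."
    )

def warn_lazy(optimizer_class):
    # %-style lazy formatting: the logging module interpolates the
    # argument only when the record is actually emitted.
    logger.warning(
        "Creating the optimizer %s without TorchScript support.", optimizer_class
    )
```

Either form makes the separate `str(optimizer_class)` argument unnecessary at the call site; the lazy form is cheaper when the warning level is disabled.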

Comment thread torch/distributed/optim/optimizer.py Outdated
optimizer_new_func = _new_script_local_optimizer
else:
logger.warn(
"Creating the optimizer %s without TorchScript support, this might results in "
Contributor:

"this might results" -> "this might result"

wanchaol added a commit that referenced this pull request Jan 20, 2021

ghstack-source-id: ddaa6ce
Pull Request resolved: #50630
wanchaol added a commit that referenced this pull request Jan 20, 2021

ghstack-source-id: 22243c9
Pull Request resolved: #50630
wanchaol added a commit that referenced this pull request Jan 21, 2021

ghstack-source-id: a3fa708
Pull Request resolved: #50630
wanchaol added a commit that referenced this pull request Jan 23, 2021

ghstack-source-id: 4eafa08
Pull Request resolved: #50630
wanchaol added a commit that referenced this pull request Jan 26, 2021

ghstack-source-id: 678a4ce
Pull Request resolved: #50630
@facebook-github-bot (Contributor)

@wanchaol merged this pull request in 3562ca2.

@facebook-github-bot facebook-github-bot deleted the gh/wanchaol/157/head branch January 30, 2021 15:21
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Pull Request resolved: pytorch#50630

Add a warning log to distributed optimizer, to warn user the optimizer
is created without TorchScript support.

Test Plan: Imported from OSS

Reviewed By: rohan-varma

Differential Revision: D25932777

Pulled By: wanchaol

fbshipit-source-id: 8db3b98bdd27fc04c5a3b8d910b028c0c37f138d

Labels

cla signed · Merged · oncall: distributed (Add this issue/PR to distributed oncall triage queue)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

3 participants