docs: replace torch.distributed.run by torchrun #27528
amyeroberts merged 2 commits into huggingface:main from
Conversation
I just found an old PR, https://github.com/huggingface/transformers/pull/21780/files, which did similar work but is out of sync.
amyeroberts
left a comment
Thanks for adding this!
Just a few comments on removing the changes for the research examples.
For any future readers who stumble on this PR: the previously closed PR requested comments in the docs for older pytorch versions. We now officially support pytorch >= 1.10. The entrypoint torchrun is present from 1.10 onwards.
examples/research_projects/self-training-text-classification/README.md (outdated, resolved)
examples/research_projects/seq2seq-distillation/precomputed_pseudo_labels.md (outdated, resolved)
`transformers` now officially supports pytorch >= 1.10. The entrypoint `torchrun` is present from 1.10 onwards.

Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
Comments already addressed; the PR can be merged now.
Applied @ArthurZucker's suggestion.

Co-authored-by: Arthur <48595927+ArthurZucker@users.noreply.github.com>
What does this PR do?
The old launcher is deprecated; this PR moves the docs to the new `torchrun`:

FutureWarning: The module torch.distributed.launch is deprecated and will be removed in future. Use torchrun.

This PR only addresses the docs, not the unit tests, to limit the impact.
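For reference, the switch is a drop-in replacement at the command line. The script name and flag value below are illustrative, not taken from this PR:

```shell
# Deprecated launcher (prints the FutureWarning quoted above):
python -m torch.distributed.launch --nproc_per_node=2 train.py

# Replacement, available from PyTorch 1.10 onwards:
torchrun --nproc_per_node=2 train.py
```

One behavioral note: `torchrun` passes the local rank to each worker via the `LOCAL_RANK` environment variable rather than a `--local_rank` command-line argument (it behaves like `torch.distributed.launch --use_env`), so scripts that parse `--local_rank` may need a small adjustment.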
Fixes # (issue)
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.