Avoid wrapping LightningModule in DDP plugins when not fitting #9096

Merged
ananthsub merged 2 commits into Lightning-AI:master from four4fish:feat/6977
Sep 2, 2021

Conversation

@four4fish (Contributor) commented Aug 24, 2021

What does this PR do?

Pulls the changes from #8632.

Fixes #6977
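
For context, a minimal sketch of the behaviour this PR is after, assuming hypothetical names (`MiniDDPPlugin`, `setup_model`, a `trainer_fn` string) rather than the actual Lightning plugin internals: wrap the LightningModule in `DistributedDataParallel` only while fitting, and leave it unwrapped for validation, test, and predict, where no gradient synchronization is needed.

```python
# Illustrative sketch only, not the actual Lightning plugin code: wrap the
# LightningModule in DistributedDataParallel only when the trainer is fitting.
import torch
from torch.nn.parallel import DistributedDataParallel


class MiniDDPPlugin:
    def __init__(self, model: torch.nn.Module, local_rank: int):
        self.model = model
        self.local_rank = local_rank

    def setup_model(self, trainer_fn: str) -> torch.nn.Module:
        # Requires torch.distributed to be initialized before DDP construction.
        if trainer_fn == "fit":
            # Training needs gradient synchronization across processes.
            self.model = DistributedDataParallel(self.model, device_ids=[self.local_rank])
        # For "validate", "test" and "predict" there is no backward pass, so the
        # plain LightningModule is used directly and never wrapped.
        return self.model
```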

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing make sure you have read Review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@ananthsub (Contributor) left a comment


Thanks for carrying this forward!

Comment thread pytorch_lightning/plugins/training_type/ddp.py Outdated
Comment thread pytorch_lightning/plugins/training_type/ddp_spawn.py
Comment thread tests/plugins/test_ddp_plugin.py
@ananthsub added the distributed (Generic distributed-related topic) and feature (Is an improvement or enhancement) labels Aug 24, 2021
@ananthsub added this to the v1.5 milestone Aug 24, 2021
Comment thread CHANGELOG.md Outdated
Comment thread pytorch_lightning/plugins/training_type/ddp.py
Comment thread pytorch_lightning/plugins/training_type/ddp_spawn.py
@awaelchli changed the title from "Avoid wrapping LightningModule in DDP plugins when not fitting" to "Avoid wrapping LightningModule in DDP plugins when not fitting [1/2]" Aug 25, 2021
@four4fish changed the title from "Avoid wrapping LightningModule in DDP plugins when not fitting [1/2]" to "Avoid wrapping LightningModule in DDP plugins when not fitting" Aug 25, 2021
Comment thread tests/plugins/test_sharded_plugin.py Outdated
@four4fish force-pushed the feat/6977 branch 2 times, most recently from 870790e to 1a2245a, on August 25, 2021 21:28
@codecov codecov bot commented Aug 25, 2021

Codecov Report

Merging #9096 (ca6ed38) into master (ff7305f) will decrease coverage by 4%.
The diff coverage is 67%.

@@           Coverage Diff           @@
##           master   #9096    +/-   ##
=======================================
- Coverage      92%     88%    -4%     
=======================================
  Files         176     176            
  Lines       14810   14817     +7     
=======================================
- Hits        13663   13050   -613     
- Misses       1147    1767   +620     

@awaelchli (Contributor) commented:

@four4fish Isn't this a subset of the changes in #8632?

@four4fish (Contributor, Author) commented:

> @four4fish Isn't this a subset of the changes in #8632?

@awaelchli No, I have commandeered #8632, since Ning is focused on the data loader changes.

Comment thread CHANGELOG.md Outdated
Comment thread tests/plugins/test_ddp_plugin.py Outdated
@mergify bot added the ready to be merged (PRs ready to be merged) label Aug 25, 2021
@awaelchli (Contributor) commented Aug 25, 2021

@four4fish Are we closing #8632 then? The contents are virtually identical; at the least we should mark one of them as a draft.

@four4fish (Contributor, Author) commented:
> @four4fish Are we closing #8632 then? The contents are virtually identical; at the least we should mark one of them as a draft.

@awaelchli Yeah! Will close #8632. Sorry, I should have marked this as a commandeered PR. Thanks! :)

Comment thread pytorch_lightning/plugins/training_type/ddp.py Outdated
@ananthsub (Contributor) left a comment


I don't think we need to pass wrap_model into configure_ddp purely for the debug message. I'd rather configure_ddp only wrap the module. Duplicating the debug line while keeping the configure_ddp interface simple doesn't seem like a bad option to me here.
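
A rough sketch of what that suggestion could look like (hypothetical `PluginSketch`, not the real plugin code): `configure_ddp` stays flag-free and only wraps the module, while each call site logs its own debug line.

```python
# Hypothetical sketch of the reviewer's suggestion; names are illustrative.
import logging
from typing import Callable

import torch

log = logging.getLogger(__name__)


class PluginSketch:
    def __init__(self, model: torch.nn.Module, wrap: Callable[[torch.nn.Module], torch.nn.Module]):
        self._model = model
        self._wrap = wrap  # e.g. a DistributedDataParallel factory

    def configure_ddp(self) -> None:
        # Single responsibility: wrap the module. No wrap_model flag.
        self._model = self._wrap(self._model)

    def pre_dispatch(self, fitting: bool) -> None:
        if fitting:
            log.debug("Wrapping the LightningModule in DDP for fitting.")
            self.configure_ddp()
        else:
            # The debug line is duplicated at the call site rather than
            # threading an extra argument through configure_ddp.
            log.debug("Not fitting: leaving the LightningModule unwrapped.")
```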

Comment thread pytorch_lightning/plugins/training_type/ddp.py Outdated
Comment thread pytorch_lightning/plugins/training_type/ddp_spawn.py Outdated
Comment thread pytorch_lightning/plugins/training_type/ddp_spawn.py Outdated
Comment thread pytorch_lightning/plugins/training_type/sharded.py Outdated
@tchaton (Contributor) left a comment


LGTM!

Comment thread CHANGELOG.md Outdated
Comment thread pytorch_lightning/trainer/connectors/logger_connector/result.py
@mergify bot removed the has conflicts label Aug 31, 2021
Comment thread pytorch_lightning/plugins/training_type/ddp_spawn.py
@ananthsub ananthsub enabled auto-merge (squash) September 1, 2021 02:44
auto-merge was automatically disabled September 1, 2021 02:55

Head branch was pushed to by a user without write access

@ananthsub ananthsub enabled auto-merge (squash) September 1, 2021 03:26
Comment thread CHANGELOG.md Outdated
Comment thread tests/plugins/test_sharded_plugin.py Outdated
@ananthsub (Contributor) commented:

@four4fish - the tests are failing because the DeepSpeed plugin extends the DDP plugin. However, for DeepSpeed we should always wrap the model and not override the training/validation/test/predict steps (i.e. DeepSpeed should continue calling self.model(*args, **kwargs)).

This is another motivation for not relying on inheritance between these plugins, as both the wrapping and checkpoint behavior differ.
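
As a standalone illustration of that difference (hypothetical names, not the real Lightning or DeepSpeed plugin API): a DeepSpeed-style plugin would wrap unconditionally, ignoring the trainer function that the DDP sketch above checks.

```python
# Illustrative only: a DeepSpeed-like plugin always wraps the model.
from typing import Callable

import torch


class DeepSpeedLikePluginSketch:
    def __init__(self, model: torch.nn.Module, wrap: Callable[[torch.nn.Module], torch.nn.Module]):
        self.model = model
        self._wrap = wrap  # placeholder for the engine factory (e.g. deepspeed.initialize)

    def setup_model(self, trainer_fn: str) -> torch.nn.Module:
        # The trainer function is deliberately ignored: the wrapped engine is
        # driven via self.model(*args, **kwargs) in every mode, so it must
        # always exist, even for validate/test/predict.
        del trainer_fn
        self.model = self._wrap(self.model)
        return self.model
```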

Comment thread tests/plugins/test_sharded_plugin.py Outdated
auto-merge was automatically disabled September 2, 2021 01:06

Head branch was pushed to by a user without write access

@ananthsub ananthsub enabled auto-merge (squash) September 2, 2021 01:13
@ananthsub ananthsub merged commit a451997 into Lightning-AI:master Sep 2, 2021
Comment thread CHANGELOG.md
- Fixed `DDP` "CUDA error: initialization error" due to a `copy` instead of `deepcopy` on `ResultCollection` ([#9239](https://github.com/PyTorchLightning/pytorch-lightning/pull/9239))


- Fixed wrapping issue: avoid wrapping LightningModule with data-parallel modules when not fitting in `DDPPlugin`, `DDPSpawnPlugin`, `DDPShardedPlugin`, `DDPSpawnShardedPlugin` ([#9096]https://github.com/PyTorchLightning/pytorch-lightning/pull/9096)

The link is broken

Can be fixed with another PR - no need to open one just for this


Labels

distributed (Generic distributed-related topic), feature (Is an improvement or enhancement), ready to be merged (PRs ready to be merged)


Development

Successfully merging this pull request may close these issues.

Avoid wrapping LightningModule in *DataParallel overrides when not fitting

6 participants