[tune](deps): Bump pytorch-lightning from 1.0.3 to 1.3.8 in /python/requirements/tune #40

Closed

dependabot[bot] wants to merge 1 commit into master from dependabot/pip/python/requirements/tune/pytorch-lightning-1.3.8

Conversation

dependabot[bot] commented on behalf of GitHub on Jul 3, 2021

Bumps pytorch-lightning from 1.0.3 to 1.3.8.
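Since this bump crosses minor versions (1.0 → 1.3), it can be worth confirming the environment actually picked up the new pin. A minimal sketch, not part of this PR; it assumes the `packaging` package is available:

```python
# Hypothetical sanity check: confirm the installed pytorch-lightning
# satisfies the new pin from python/requirements/tune.
from packaging.version import Version

import pytorch_lightning as pl

assert Version(pl.__version__) >= Version("1.3.8"), (
    f"expected pytorch-lightning >= 1.3.8, found {pl.__version__}"
)
```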

Release notes

Sourced from pytorch-lightning's releases.

Standard weekly patch release

[1.3.8] - 2021-07-01

Fixed

  • Fixed a sync deadlock when checkpointing a LightningModule that uses a torchmetrics 0.4 Metric (#8218)
  • Fixed compatibility with TorchMetrics v0.4 (#8206)
  • Added torchelastic check when sanitizing GPUs (#8095)
  • Fixed a DDP info message that was never shown (#8111)
  • Fixed metrics deprecation message at module import level (#8163)
  • Fixed a bug where an infinite recursion would be triggered when using the BaseFinetuning callback on a model that contains a ModuleDict (#8170); a sketch follows this list
  • Added a mechanism to detect a DDP deadlock when only one process triggers an Exception; the mechanism kills the processes when this happens (#8167)
  • Fixed NCCL error when selecting non-consecutive device ids (#8165)
  • Fixed SWA to also work with IterableDataset (#8172)
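To illustrate the BaseFinetuning/ModuleDict fix above: a frozen model tree containing an nn.ModuleDict is the kind of setup that previously recursed forever. All class and attribute names below are invented for illustration; only the BaseFinetuning hooks come from the library:

```python
# Sketch of a setup that used to trigger infinite recursion (#8170):
# freezing a module tree that contains an nn.ModuleDict.
from torch import nn
from pytorch_lightning.callbacks import BaseFinetuning


class BackboneWithHeads(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(32, 16)
        # The ModuleDict is the part that used to break BaseFinetuning.
        self.heads = nn.ModuleDict(
            {"task_a": nn.Linear(16, 2), "task_b": nn.Linear(16, 3)}
        )


class FreezeBackbone(BaseFinetuning):
    def __init__(self, unfreeze_at_epoch=2):
        super().__init__()
        self.unfreeze_at_epoch = unfreeze_at_epoch

    def freeze_before_training(self, pl_module):
        # Freeze the whole model, ModuleDict included, before training starts.
        self.freeze(pl_module.net)

    def finetune_function(self, pl_module, current_epoch, optimizer, optimizer_idx):
        # Later, thaw just the backbone and hand its parameters to the optimizer.
        if current_epoch == self.unfreeze_at_epoch:
            self.unfreeze_and_add_param_group(
                modules=pl_module.net.backbone, optimizer=optimizer
            )
```

Here `pl_module.net` is assumed to be a `BackboneWithHeads` instance attached to the LightningModule.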

Contributors

@GabrielePicco @SeanNaren @ethanwharris @carmocca @tchaton @justusschock

Hotfix Patch Release

[1.3.7post0] - 2021-06-23

Fixed

  • Fixed backward compatibility of moved functions rank_zero_warn and rank_zero_deprecation (#8085)

Contributors

@kaushikb11 @carmocca

Standard weekly patch release

[1.3.7] - 2021-06-22

Fixed

  • Fixed a bug where skipping an optimizer while using amp caused amp to trigger an assertion error (#7975)
  • Fixed deprecation messages not showing due to incorrect stacklevel (#8002, #8005)
  • Fixed setting a DistributedSampler when using a distributed plugin in a custom accelerator (#7814)
  • Improved PyTorchProfiler chrome traces names (#8009)
  • Fixed moving the best score to device in EarlyStopping callback for TPU devices (#7959)

Contributors

@yifuwang @kaushikb11 @ajtritt @carmocca @tchaton

Standard weekly patch release

[1.3.6] - 2021-06-15

... (truncated)

Changelog

Sourced from pytorch-lightning's changelog.

[1.3.8] - 2021-07-01

Fixed

  • Fixed a sync deadlock when checkpointing a LightningModule that uses a torchmetrics 0.4 Metric (#8218)

  • Fixed compatibility with TorchMetrics v0.4 (#8206)

  • Added torchelastic check when sanitizing GPUs (#8095)

  • Fixed a DDP info message that was never shown (#8111)

  • Fixed metrics deprecation message at module import level (#8163)

  • Fixed a bug where an infinite recursion would be triggered when using the BaseFinetuning callback on a model that contains a ModuleDict (#8170)

  • Added a mechanism to detect a DDP deadlock when only one process triggers an Exception; the mechanism kills the processes when this happens (#8167)

  • Fixed NCCL error when selecting non-consecutive device ids (#8165)

  • Fixed SWA to also work with IterableDataset (#8172)

  • Fixed a bug where truncated_bptt_steps would throw an AttributeError when the target RNN has multiple hidden states (#8145); a sketch follows this list

  • Fixed passing a custom DDPPlugin when choosing accelerator="ddp_cpu" (#6208)
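As a concrete picture of the truncated_bptt_steps item flagged above: an LSTM carries its state as an (h, c) tuple, i.e. multiple hidden states, which is the case #8145 repaired. A minimal sketch; sizes and names are made up, and it assumes the 1.3-era API where truncated_bptt_steps is set on the LightningModule and training_step receives a hiddens argument:

```python
# Sketch of truncated BPTT with a tuple hidden state (#8145).
import torch
from torch import nn
from torch.nn import functional as F
import pytorch_lightning as pl


class TBPTTRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(input_size=8, hidden_size=8, batch_first=True)
        self.head = nn.Linear(8, 1)
        # Split each sequence into chunks of 2 time steps.
        self.truncated_bptt_steps = 2

    def training_step(self, batch, batch_idx, hiddens):
        x, y = batch
        # `hiddens` is the LSTM's (h, c) tuple carried across chunks.
        out, hiddens = self.lstm(x, hiddens)
        loss = F.mse_loss(self.head(out), y)
        return {"loss": loss, "hiddens": hiddens}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```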

[1.3.7] - 2021-06-22

Fixed

  • Fixed a bug where skipping an optimizer while using amp caused amp to trigger an assertion error (#7975)
  • Fixed deprecation messages not showing due to incorrect stacklevel (#8002, #8005)
  • Fixed setting a DistributedSampler when using a distributed plugin in a custom accelerator (#7814)
  • Improved PyTorchProfiler Chrome trace names (#8009); a sketch follows this list
  • Fixed moving the best score to device in EarlyStopping callback for TPU devices (#7959)
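For the Chrome-trace naming item above, enabling the profiler is a one-liner; a minimal sketch (the max_epochs value is arbitrary):

```python
# Requesting the PyTorch profiler by name; the Trainer then writes
# Chrome-trace files (viewable at chrome://tracing), whose naming
# #8009 improved.
import pytorch_lightning as pl

trainer = pl.Trainer(profiler="pytorch", max_epochs=1)
```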

[1.3.6] - 2021-06-15

Fixed

  • Fixed logs overwriting issue for remote filesystems (#7889)
  • Fixed a bug where DataModule.prepare_data could only be called on the global rank 0 process (#7945)
  • Fixed setting worker_init_fn to seed dataloaders correctly when using DDP (#7942); a sketch follows this list
  • Fixed the BaseFinetuning callback to properly handle parent modules with parameters (#7931)
  • Fixed access to callback_metrics in ddp_spawn (#7916)
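For the worker_init_fn item above, the relevant entry point is seed_everything(..., workers=True), which installs a worker_init_fn so every dataloader worker gets a distinct, reproducible seed; #7942 fixed this path under DDP. A minimal sketch; the accelerator and gpus values are illustrative and use the 1.3-era argument names:

```python
# Reproducible dataloader-worker seeding, including under DDP (#7942).
import pytorch_lightning as pl

pl.seed_everything(42, workers=True)
trainer = pl.Trainer(accelerator="ddp", gpus=2)  # 1.3-era arguments
```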

[1.3.5] - 2021-06-08

Added

  • Added warning to Training Step output (#7779)

... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

dependabot[bot] added the dependencies label (Pull requests that update a dependency file) on Jul 3, 2021
Bumps [pytorch-lightning](https://github.com/PyTorchLightning/pytorch-lightning) from 1.0.3 to 1.3.8.
- [Release notes](https://github.com/PyTorchLightning/pytorch-lightning/releases)
- [Changelog](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)
- [Commits](Lightning-AI/pytorch-lightning@1.0.3...1.3.8)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] force-pushed the dependabot/pip/python/requirements/tune/pytorch-lightning-1.3.8 branch from e0b5d8c to 8229980 on July 13, 2021 18:19

dependabot[bot] (Author) commented on behalf of GitHub on Jul 31, 2021

Superseded by #44.

dependabot[bot] closed this on Jul 31, 2021
dependabot[bot] deleted the dependabot/pip/python/requirements/tune/pytorch-lightning-1.3.8 branch on July 31, 2021 07:07

Labels

dependencies: Pull requests that update a dependency file
