Fix logging on_train_batch_end in a callback with multiple optimizers#5521

Merged
carmocca merged 4 commits into Lightning-AI:master from carmocca:bug/5459
Jan 18, 2021

Fix logging on_train_batch_end in a callback with multiple optimizers#5521
carmocca merged 4 commits intoLightning-AI:masterfrom
carmocca:bug/5459

Conversation

@carmocca
Contributor

@carmocca carmocca commented Jan 15, 2021

What does this PR do?

Fixes #5459

High-level explanation:
Previously, we iterated with range(num_optimizers), but this fails because epoch_metrics can contain data for only a subset of the optimizers.

The fix is therefore to loop over the epoch_metrics keys instead. list(epoch_metrics) (a copy of the keys) is necessary because keys are deleted inside the loop.
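A minimal sketch of the pattern (the names below are illustrative, not Lightning's actual internals): indexing by range(num_optimizers) assumes every optimizer index is present in epoch_metrics, whereas looping over list(epoch_metrics) visits only the keys that actually exist and, because it iterates a copy, tolerates deleting entries mid-loop.

```python
def consume_epoch_metrics(epoch_metrics):
    """Process and clear per-optimizer metrics.

    Iterates over a *copy* of the keys (list(epoch_metrics)) because
    entries are deleted during the loop; iterating the dict directly
    while deleting from it would raise RuntimeError.
    """
    processed = {}
    for opt_idx in list(epoch_metrics):
        processed[opt_idx] = epoch_metrics.pop(opt_idx)
    return processed


# Only optimizer 1 logged anything this epoch, so the old
# range(num_optimizers) loop would hit a KeyError for optimizer 0.
metrics = {1: {"loss": 0.5}}
result = consume_epoch_metrics(metrics)
```

After the call, result holds the per-optimizer metrics and epoch_metrics is empty, matching the delete-while-looping behavior described above.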

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • [n/a] Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or internal minor changes/refactorings)

PR review

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified
  • Check that target branch and milestone match!

@carmocca carmocca self-assigned this Jan 15, 2021
@codecov

codecov bot commented Jan 15, 2021

Codecov Report

Merging #5521 (6ffc2b0) into master (a56f745) will decrease coverage by 0%.
The diff coverage is 100%.

@@          Coverage Diff           @@
##           master   #5521   +/-   ##
======================================
- Coverage      93%     93%   -0%     
======================================
  Files         135     135           
  Lines       10007   10005    -2     
======================================
- Hits         9341    9339    -2     
  Misses        666     666           

Contributor

@tchaton tchaton left a comment


LGTM !

Contributor

@ananthsub ananthsub left a comment


thanks for the fix!

@awaelchli awaelchli added bug Something isn't working logging Related to the `LoggerConnector` and `log()` labels Jan 18, 2021
@awaelchli awaelchli added this to the 1.1.x milestone Jan 18, 2021
@awaelchli awaelchli added the priority: 0 High priority task label Jan 18, 2021
@carmocca carmocca enabled auto-merge (squash) January 18, 2021 13:05
@carmocca carmocca merged commit 18d2ae8 into Lightning-AI:master Jan 18, 2021
@carmocca carmocca deleted the bug/5459 branch January 18, 2021 21:31
Borda pushed a commit that referenced this pull request Feb 4, 2021
…#5521)

* Start with the failing test

* Then fix the failing test

* Update CHANGELOG

Labels

bug Something isn't working logging Related to the `LoggerConnector` and `log()` priority: 0 High priority task

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[BUG] Logging in a callback does not work with multiple optimizers

4 participants