
[FR] Warn if scheduler.step() is called but optim.step has not been called #20124

@ssnl

Description


In 1.1 we made a major BC-breaking change: the order of calling LR schedulers changed from

for e in range(nepochs):
  scheduler.step()
  train()

to

for e in range(nepochs):
  train()
  scheduler.step()
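
The new ordering can be sketched as a minimal runnable example. The model, optimizer, and scheduler choices below (a linear layer, SGD, and StepLR) are hypothetical stand-ins for illustration, not part of the issue:

```python
import torch
from torch import nn, optim

# Hypothetical minimal setup to illustrate the 1.1+ ordering.
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

def train():
    # One optimization step stands in for a full training epoch.
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()

nepochs = 3
for e in range(nepochs):
    train()
    scheduler.step()  # after optimizer.step(), per the 1.1 semantics

# lr halves once per epoch: 0.1 -> 0.05 -> 0.025 -> 0.0125
print(optimizer.param_groups[0]["lr"])
```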

This silently breaks a lot of code, and makes it impossible to write code that behaves consistently on both 1.0.1 and 1.1. So I propose adding a warning in scheduler.step() that looks at the corresponding optimizer and checks whether its .step() has been called.

If it has not been called, that is a sign the user is calling scheduler.step() in the old pattern. I can't think of a reasonable case where this would produce a false positive.
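
One way the proposed check could work is for the optimizer to keep a step counter that the scheduler inspects. The toy Optimizer and Scheduler classes below are illustrative stand-ins, not PyTorch's actual implementation:

```python
import warnings

class Optimizer:
    """Toy stand-in for torch.optim.Optimizer (illustration only)."""
    def __init__(self):
        self._step_count = 0

    def step(self):
        self._step_count += 1
        # ... apply parameter updates here ...

class Scheduler:
    """Toy LR scheduler that warns if optimizer.step() has not run yet."""
    def __init__(self, optimizer):
        self.optimizer = optimizer

    def step(self):
        if self.optimizer._step_count == 0:
            warnings.warn(
                "Detected call of scheduler.step() before optimizer.step(). "
                "Since 1.1, scheduler.step() should be called after "
                "optimizer.step()."
            )
        # ... adjust learning rates here ...

opt = Optimizer()
sched = Scheduler(opt)
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    sched.step()       # old pattern: optimizer has not stepped, so this warns
print(len(caught))     # one warning recorded

opt.step()
sched.step()           # new pattern: no warning
```

A per-optimizer counter keeps the check cheap and local; the scheduler only reads it, so existing training loops that follow the new ordering are unaffected.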

Metadata


Assignees

No one assigned

    Labels

    enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix)
    module: optimizer (Related to torch.optim)
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
