Labels
high priority, module: LrScheduler, module: deprecation, triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
🐛 Describe the bug
This warning probably comes from the sub-schedulers, since SequentialLR calls them with the epoch argument.
/usr/local/lib/python3.6/dist-packages/torch/optim/lr_scheduler.py:154: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
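A minimal repro sketch of the report above (the ConstantLR/ExponentialLR pair is an arbitrary choice; on affected commits such as 29a45f0, any sub-schedulers chained by SequentialLR should trigger the warning even though the user never passes an epoch):

```python
import warnings

import torch
from torch.optim import SGD
from torch.optim.lr_scheduler import ConstantLR, ExponentialLR, SequentialLR

model = torch.nn.Linear(2, 2)
optimizer = SGD(model.parameters(), lr=0.1)

# Two stock sub-schedulers chained by SequentialLR.
scheduler = SequentialLR(
    optimizer,
    schedulers=[
        ConstantLR(optimizer, factor=0.5, total_iters=2),
        ExponentialLR(optimizer, gamma=0.9),
    ],
    milestones=[2],
)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    for _ in range(4):
        optimizer.step()
        scheduler.step()  # plain call, no epoch argument

# On affected versions, `caught` contains the EPOCH_DEPRECATION_WARNING
# raised when SequentialLR crosses the milestone and internally calls
# self._schedulers[idx].step(0).
epoch_warnings = [w for w in caught if "epoch" in str(w.message)]
```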
Related code:
pytorch/torch/optim/lr_scheduler.py, line 154 in 29a45f0:
warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
pytorch/torch/optim/lr_scheduler.py, line 632 in 29a45f0:
self._schedulers[idx].step(0)
The docstring example should also be fixed:
pytorch/torch/optim/lr_scheduler.py, line 1276 in 29a45f0:
>>> scheduler.step(26)
For compatibility with third-party sub-schedulers that still rely on the epoch argument, you could:
- Add a member switch that temporarily suppresses this warning while the sub-scheduler is being called directly by SequentialLR, and revert it at the end of step().
- Or whitelist PyTorch's own scheduler types in SequentialLR so that they are stepped with the non-epoch signature.
In that case you may also need to check the type of self before printing the warning, to keep it quiet on derived types.
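The member-switch idea can be sketched with stand-in classes (these are not the real PyTorch classes, and the `_suppress_epoch_warning` attribute is a hypothetical name, not an existing attribute):

```python
import warnings

EPOCH_DEPRECATION_WARNING = (
    "The epoch parameter in `scheduler.step()` was not necessary "
    "and is being deprecated where possible."
)

class SubScheduler:
    """Stand-in for a scheduler that still warns on the epoch argument."""

    def __init__(self):
        self._suppress_epoch_warning = False  # hypothetical member switch

    def step(self, epoch=None):
        if epoch is not None and not self._suppress_epoch_warning:
            warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
        # ...learning-rate update would happen here...

class SequentialLRSketch:
    """Stand-in for SequentialLR: flips the switch around its internal step(0)."""

    def __init__(self, schedulers):
        self._schedulers = schedulers

    def step(self):
        sched = self._schedulers[0]
        sched._suppress_epoch_warning = True  # suppress while called internally
        try:
            sched.step(0)  # the internal epoch-style call (cf. line 632)
        finally:
            sched._suppress_epoch_warning = False  # revert at the end of step()
```

With this switch, the internal epoch-style call from SequentialLR stays silent, while a direct user call such as SubScheduler().step(5) still emits the deprecation warning.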
Versions
Commit 29a45f0