How to sync learning rate of lr_scheduler with that of optimizer #2461
Replies: 1 comment
- I should call these methods explicitly (from the scheduler base class):

      def step(self, epoch: int, metric: float = None) -> None:
          self.metric = metric
          values = self._get_values(epoch, on_epoch=True)
          if values is not None:
              values = self._add_noise(values, epoch)
              self.update_groups(values)

      def step_update(self, num_updates: int, metric: float = None):
          self.metric = metric
          values = self._get_values(num_updates, on_epoch=False)
          if values is not None:
              values = self._add_noise(values, num_updates)
              self.update_groups(values)
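A minimal sketch of that call pattern in a training loop, assuming timm's CosineLRScheduler; the model, data sizes, and hyperparameters below are placeholders, not taken from this thread:

```python
import torch
from timm.scheduler import CosineLRScheduler

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = CosineLRScheduler(
    optimizer,
    t_initial=100,        # total schedule length (epochs, since t_in_epochs=True by default)
    warmup_t=5,           # warmup length
    warmup_lr_init=1e-5,  # starting lr for warmup
)

num_epochs = 100
updates_per_epoch = 50    # placeholder; normally len(train_loader)
num_updates = 0

for epoch in range(num_epochs):
    for _ in range(updates_per_epoch):
        # ... forward pass, loss.backward() ...
        optimizer.step()
        optimizer.zero_grad()
        num_updates += 1
        scheduler.step_update(num_updates=num_updates)  # per-iteration step
    scheduler.step(epoch + 1)                           # per-epoch step
    print(epoch, optimizer.param_groups[0]["lr"])       # now follows the schedule
```

Calling both is harmless: with `t_in_epochs=True` only `step(epoch)` changes the values, and with `t_in_epochs=False` only `step_update(num_updates)` does.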
- Hey, I'm trying to use CosineLRScheduler, but I noticed the optimizer's learning rate isn't in sync with the lr_scheduler: the optimizer learning rate I'm getting stays at warmup_lr_init, while the learning rate calculated by the lr_scheduler seems correct. Here's my sample for checking the learning rate; I train my model the same way.
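  The sample itself isn't shown above; the sketch below is a placeholder reconstruction of such a check (model and hyperparameters are illustrative, not the poster's code) that shows the symptom:

  ```python
  import torch
  from timm.scheduler import CosineLRScheduler

  model = torch.nn.Linear(10, 2)
  optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
  # Constructing the scheduler with a warmup sets the optimizer lr to warmup_lr_init.
  scheduler = CosineLRScheduler(optimizer, t_initial=100, warmup_t=5, warmup_lr_init=1e-5)

  for epoch in range(3):
      print(epoch, optimizer.param_groups[0]["lr"])  # stays at 1e-05
      # scheduler.step(epoch + 1)  # without an explicit call like this, the lr never advances
  ```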