
I am using ReduceLROnPlateau to modify the learning rate during training of a PyTorch model. ReduceLROnPlateau does not inherit from LRScheduler and does not implement the get_last_lr method, which is PyTorch's recommended way of getting the current learning rate when using a learning rate scheduler.

How can I get the learning rate when using ReduceLROnPlateau?

Currently I am doing the following but am not sure if this is rigorous and correct:

lr = optimizer.state_dict()["param_groups"][0]["lr"]
Anil

1 Answer


You can skip the state_dict of the optimizer and access the learning rate directly:

optimizer.param_groups[0]["lr"]
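Both approaches read the same underlying value: ReduceLROnPlateau mutates the optimizer's param_groups in place when it reduces the rate, so the optimizer is always the source of truth. A minimal sketch (the model, loss value, and hyperparameters below are illustrative stand-ins, not from the question):

```python
import torch

# Dummy model and optimizer just to show where the LR lives.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# patience=0 and factor=0.5 so the effect is visible quickly.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, factor=0.5, patience=0
)

for epoch in range(3):
    val_loss = 1.0  # stand-in for a real validation loss
    scheduler.step(val_loss)
    # The scheduler rewrites optimizer.param_groups directly,
    # so the current LR can be read straight off the optimizer:
    current_lr = optimizer.param_groups[0]["lr"]
    print(f"epoch {epoch}: lr = {current_lr}")
```

Note that with a constant (non-improving) loss and patience=0, the LR is halved on every epoch after the first, which you can confirm from the printed values.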
Shai