
Is it possible in PyTorch to change the learning rate of the optimizer in the middle of training dynamically (I don't want to define a learning rate schedule beforehand)?

So let's say I have an optimizer:

optim = torch.optim.SGD(model.parameters(), lr=0.01)

Now, due to some tests I perform during training, I realize my learning rate is too high and I want to change it to, say, 0.001. There doesn't seem to be a method like optim.set_lr(0.001), but is there some way to do this?

patapouf_ai

2 Answers


So the learning rate is stored in optim.param_groups[i]['lr']. optim.param_groups is a list of the different weight groups which can have different learning rates. Thus, simply doing:

for g in optim.param_groups:
    g['lr'] = 0.001

will do the trick.
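If you change the rate more than once during training, you could wrap this in a small helper function (a minimal sketch; set_lr is not a PyTorch method, just an illustrative name):

def set_lr(optimizer, lr):
    # Apply the same learning rate to every parameter group.
    for g in optimizer.param_groups:
        g['lr'] = lr

# e.g. after noticing the loss diverging:
set_lr(optim, 0.001)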


Alternatively,

as mentioned in the comments, if your learning rate only depends on the epoch number, you can use a learning rate scheduler.

For example (modified example from the doc):

import torch
from torch.optim.lr_scheduler import LambdaLR

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# Assuming optimizer has two parameter groups.
lambda_group1 = lambda epoch: epoch // 30    # multiplicative factor for the first group
lambda_group2 = lambda epoch: 0.95 ** epoch  # multiplicative factor for the second group
scheduler = LambdaLR(optimizer, lr_lambda=[lambda_group1, lambda_group2])
for epoch in range(100):
    train(...)
    validate(...)
    scheduler.step()

Also, there is a prebuilt scheduler, torch.optim.lr_scheduler.ReduceLROnPlateau, which reduces the learning rate when a monitored metric stops improving.
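For instance, a minimal sketch of using it (the factor and patience values here are just illustrative, and train/validate are the same placeholders as above):

from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
# Divide the lr by 10 whenever the validation loss hasn't improved for 10 epochs.
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=10)

for epoch in range(100):
    train(...)
    val_loss = validate(...)
    scheduler.step(val_loss)  # pass the monitored metric to step()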

patapouf_ai

Instead of the loop in patapouf_ai's answer, you can do it directly via:

optim.param_groups[0]['lr'] = 0.001
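
This works when the optimizer has a single parameter group, which is the common case when you pass model.parameters() directly; a minimal sketch of guarding that assumption:

# Assumes a single parameter group; otherwise fall back to the loop above.
assert len(optim.param_groups) == 1
optim.param_groups[0]['lr'] = 0.001
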
Michael