I am using PyTorch and I want to change the learning rate after some epochs.
However, the code provided in most documentation and tutorials, which is:
optimizer = torch.optim.Adam([
    dict(params=model.parameters(), lr=learning_rate),
])
# This line specifically
optimizer.params_group[0]['lr'] = learning_rate
does not work.
PyCharm even flags it with the warning:
Unresolved attribute reference 'params_group' for class 'Adam'
and at runtime the error raised is:
AttributeError: 'Adam' object has no attribute 'params_group'
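For completeness, here is a minimal self-contained reproduction (the linear model is just a placeholder standing in for my actual model):

```python
import torch

# Placeholder model; any nn.Module with parameters reproduces the issue
model = torch.nn.Linear(4, 2)
learning_rate = 1e-3

optimizer = torch.optim.Adam([
    dict(params=model.parameters(), lr=learning_rate),
])

try:
    # The line from the snippets I found, which fails
    optimizer.params_group[0]['lr'] = 1e-4
except AttributeError as e:
    print(e)  # the same AttributeError described above
```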
How should one manually change the learning rate in PyTorch (1.6)?