
I am working with the Stable-Baselines3 package. I know that I can create a learning rate schedule by passing a function to the "learning_rate" argument. However, what I want is to decrease the learning rate adaptively. Instead of fine-tuning a new LR schedule for every situation, I want the learning rate to "automatically" decrease by a certain amount when performance hits a plateau/stops increasing.

This post explores how to do this in PyTorch. I know SB3 is built on PyTorch, so is that same functionality somehow usable in SB3?

Is there an elegant way to do this in SB3? Or would I have to stop training periodically, run a check, set a new LR, resume training, and so on?

Thank you!

Vladimir Belik

1 Answer


I haven't done it myself, but I'm dealing with SB3 callbacks these days, and it should be possible to implement a custom callback that monitors performance and changes the model's learning rate based on it.

Callbacks let you access the internal state of the RL model during training, which allows monitoring, auto-saving, model manipulation, progress bars, and more.

See the callbacks guide: https://stable-baselines3.readthedocs.io/en/master/guide/callbacks.html
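As a rough sketch of what such a callback could look like: the class name `ReduceLROnPlateauCallback`, its parameters (`check_freq`, `patience`, `factor`, `min_lr`), and the plateau rule below are all my own illustrative choices, not part of SB3. One caveat I'm aware of: SB3 re-applies `model.lr_schedule` to the optimizer on every training update, so writing directly to `optimizer.param_groups` gets overwritten; the sketch therefore replaces the schedule itself.

```python
# Hypothetical "reduce LR on plateau" callback for Stable-Baselines3.
# All names/parameters here are illustrative assumptions, not SB3 API.
try:
    from stable_baselines3.common.callbacks import BaseCallback
except ImportError:  # minimal stub so the sketch loads without SB3 installed
    class BaseCallback:
        def __init__(self, verbose: int = 0):
            self.verbose = verbose


def plateau_detected(history, patience, min_delta=0.0):
    """True if the last `patience` values never beat the best value
    seen before them by more than `min_delta`."""
    if len(history) <= patience:
        return False
    best_before = max(history[:-patience])
    return max(history[-patience:]) <= best_before + min_delta


class ReduceLROnPlateauCallback(BaseCallback):
    """Multiply the learning rate by `factor` when mean episode reward plateaus."""

    def __init__(self, check_freq=10_000, patience=5, factor=0.5,
                 min_lr=1e-6, verbose=0):
        super().__init__(verbose)
        self.check_freq = check_freq
        self.patience = patience
        self.factor = factor
        self.min_lr = min_lr
        self.history = []

    def _on_step(self) -> bool:
        if self.n_calls % self.check_freq != 0:
            return True
        # ep_info_buffer holds dicts like {"r": episode_reward, "l": length}
        rewards = [info["r"] for info in self.model.ep_info_buffer]
        if not rewards:
            return True
        self.history.append(sum(rewards) / len(rewards))
        if plateau_detected(self.history, self.patience):
            current_lr = self.model.policy.optimizer.param_groups[0]["lr"]
            new_lr = max(current_lr * self.factor, self.min_lr)
            # Replace the schedule so SB3's own LR update uses the new value.
            self.model.lr_schedule = lambda _: new_lr
            self.history.clear()  # restart plateau tracking at the new LR
            if self.verbose:
                print(f"Plateau detected, lowering LR to {new_lr:.2e}")
        return True
```

You would then pass it like any other callback, e.g. `model.learn(total_timesteps=1_000_000, callback=ReduceLROnPlateauCallback(verbose=1))`, but again, treat this as a starting point to verify against the callback docs rather than a tested implementation.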

GHE