I am working with the Stable-Baselines3 (SB3) package. I know that I can set up a learning rate schedule by passing a function to the "learning_rate" argument. However, what I want is to decrease the learning rate adaptively: instead of fine-tuning a new LR schedule for every situation, I want the learning rate to "automatically" drop by a certain amount whenever performance hits a plateau / stops improving.
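For reference, this is the kind of schedule function I mean. As far as I understand, SB3 calls it with the remaining training progress (going from 1 down to 0), so it can only depend on elapsed training time, not on performance (PPO and CartPole-v1 here are just placeholders):

```python
from stable_baselines3 import PPO

def linear_schedule(initial_lr: float):
    """Return a schedule that scales the LR linearly with remaining progress."""
    def schedule(progress_remaining: float) -> float:
        # progress_remaining goes from 1 (start of training) to 0 (end).
        return progress_remaining * initial_lr
    return schedule

model = PPO("MlpPolicy", "CartPole-v1", learning_rate=linear_schedule(3e-4))
model.learn(total_timesteps=10_000)
```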
This post explores how to do this in PyTorch. Since SB3 is built on PyTorch, is that same function usable somehow in SB3?
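In plain PyTorch I would reach for something like `torch.optim.lr_scheduler.ReduceLROnPlateau` (which I assume is what the linked post describes), roughly:

```python
import torch
import torch.nn as nn

# Dummy network just to have parameters for the optimizer.
net = nn.Linear(4, 2)
optimizer = torch.optim.Adam(net.parameters(), lr=3e-4)

# Halve the LR if the monitored metric (e.g. mean episode reward)
# has not improved for 10 evaluations.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="max", factor=0.5, patience=10
)

# After each evaluation you would call:
# scheduler.step(mean_episode_reward)
```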
Is there any elegant way to do this in SB3? Or would I have to stop the training periodically, run a check, set a new LR, resume, and so on (roughly like the sketch below)?
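To be concrete, the fallback I have in mind looks roughly like this. Note that overriding `model.lr_schedule` is just my guess at how to change the LR mid-run; it relies on SB3 internals rather than a documented API, and the algorithm, environment, chunk sizes, and halving factor are placeholders:

```python
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

model = PPO("MlpPolicy", "CartPole-v1", learning_rate=3e-4)

current_lr = 3e-4
best_reward = float("-inf")

for chunk in range(20):
    # Train in chunks without resetting the timestep counter.
    model.learn(total_timesteps=10_000, reset_num_timesteps=False)
    mean_reward, _ = evaluate_policy(model, model.get_env(), n_eval_episodes=10)

    if mean_reward > best_reward:
        best_reward = mean_reward
    else:
        # Plateau: halve the LR. Reassigning lr_schedule is an assumption
        # about SB3 internals, not a documented way to change the LR.
        current_lr *= 0.5
        model.lr_schedule = lambda _, lr=current_lr: lr
```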
Thank you!