I want to fine-tune ALBERT.
I see that TensorFlow supports distributing neural-network training over multiple GPUs: https://www.tensorflow.org/guide/distributed_training
Is it possible to distribute fine-tuning across both my laptop's GPU and a Colab GPU?
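For context, the multi-machine path in that guide (`tf.distribute.MultiWorkerMirroredStrategy`) coordinates workers through a `TF_CONFIG` environment variable that lists every machine's address. Here is a rough sketch of what each machine would set up; the hostnames and ports are placeholders, and I have not verified that a Colab VM can actually be reached this way:

```python
import json
import os

# Hypothetical addresses: each machine must be reachable by the other
# at its host:port pair for collective ops to connect.
cluster = {"worker": ["laptop.local:12345", "colab-vm:23456"]}

# Every worker gets the same "cluster" spec; only "task.index" differs
# (0 on the first machine, 1 on the second).
os.environ["TF_CONFIG"] = json.dumps({
    "cluster": cluster,
    "task": {"type": "worker", "index": 0},
})

# With TF_CONFIG set, each worker would then create the strategy:
#   strategy = tf.distribute.MultiWorkerMirroredStrategy()
# and build/compile the model inside strategy.scope().
```

My uncertainty is mainly about the networking part: both machines would need direct connectivity to each other, which is the piece I don't know how to arrange between a laptop and Colab.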