
I want to fine-tune ALBERT.

I see that one can distribute neural-net training over multiple GPUs using TensorFlow: https://www.tensorflow.org/guide/distributed_training
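For reference, a single-machine multi-GPU setup with MirroredStrategy from that guide looks roughly like this (just a sketch; the model and dataset here are placeholders, not my actual ALBERT fine-tuning code):

```python
import tensorflow as tf

# Replicates the model on every GPU visible to this machine.
strategy = tf.distribute.MirroredStrategy()
print("Number of replicas:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Model building and compiling must happen inside the strategy scope.
    # A real run would build/load ALBERT here instead of this toy model.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(2),
    ])
    model.compile(
        optimizer="adam",
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# model.fit(train_dataset, epochs=3)  # train_dataset: a placeholder tf.data.Dataset
```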

Is it possible to distribute fine-tuning across both my laptop's GPU and a Colab GPU?


1 Answer


I don't think that's possible, because distributed GPU training needs a fast interconnect between the GPUs: NVLink or PCIe within a single machine, or a high-bandwidth network (e.g. with NCCL) between machines. There is no such link between your laptop's GPU and a Colab GPU. This is a good read: https://lambdalabs.com/blog/introduction-multi-gpu-multi-node-distributed-training-nccl-2-0/
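To make the obstacle concrete: TensorFlow's multi-machine option, MultiWorkerMirroredStrategy, expects every worker to be directly reachable at a fixed host:port, configured through the TF_CONFIG environment variable. A laptop behind NAT and a Colab VM can't give each other that kind of connectivity. A minimal sketch of what the setup assumes (the addresses below are hypothetical):

```python
import json
import os

import tensorflow as tf

# Each worker must be able to open connections to every other worker's
# host:port listed here. Addresses are hypothetical examples.
os.environ["TF_CONFIG"] = json.dumps({
    "cluster": {
        "worker": ["192.168.1.10:12345", "192.168.1.11:12345"],
    },
    "task": {"type": "worker", "index": 0},
})

# Constructing the strategy starts the collective setup and needs the other
# workers in the cluster to be reachable over the network.
strategy = tf.distribute.MultiWorkerMirroredStrategy()

with strategy.scope():
    # Build and compile the model inside the scope, as in the single-machine case.
    model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
    model.compile(optimizer="adam", loss="mse")
```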
