I would like to copy a huge file (~80GB) from Google Drive to a Google Cloud Storage bucket.
There is a very nice Google Colab notebook by Philip Lies (found here) that does the job for small files, but it is a problem for huge files, since it seems to create a local cache before copying the file itself to the bucket, and Colab storage is limited.
The copy itself should be quick (since everything is in the cloud and moves between Google's services), but because Colab has limited storage, it reaches the storage limit before the copy completes. I would not like to use a local Colab runtime, because it would need to download the file to my computer and then upload it to the Google Storage bucket, which would take too long.
As far as I understand, there are two approaches to copying huge files from Google Drive to Google Cloud Storage:
A. Copy the file in chunks
Copy the file in chunks (say, 150MB each) from Google Drive to Google Colab, then upload each chunk to the Google bucket.
But I didn't find how to do this with these storage services. shutil seems to do the job for local files, but I could not make it work with a Google Storage bucket.
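What I have in mind is roughly the sketch below: read the mounted Drive file in fixed-size chunks and feed each chunk into a resumable GCS upload, so only one chunk is held at a time. This is just a sketch, assuming Drive is already mounted at /content/drive and a recent google-cloud-storage (for the file-like Blob.open API); my-project, my-bucket, and the file names are placeholders:

```python
from google.cloud import storage

# 150 MB per chunk; GCS resumable-upload chunks must be a multiple of 256 KB
CHUNK = 150 * 1024 * 1024

client = storage.Client(project="my-project")          # placeholder project id
blob = client.bucket("my-bucket").blob("bigfile.bin")  # placeholder bucket/object

# Assumes Drive was mounted via google.colab.drive.mount("/content/drive")
with open("/content/drive/MyDrive/bigfile.bin", "rb") as src, \
     blob.open("wb", chunk_size=CHUNK) as dst:  # resumable upload, chunk by chunk
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        dst.write(data)
```

But I don't know whether this actually avoids the problem, since the Drive mount itself may still cache the file on the Colab disk.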
B. Stream the file from Google Drive directly to the Google bucket (ideally)
If there is a way to "stream" the file from one store to the other, that would be ideal, but I'm not sure whether this approach is possible.
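The closest I could imagine is the sketch below: pull the file in chunks through the Drive API (MediaIoBaseDownload) and write each chunk straight into a resumable GCS upload, so the full file never lands on the Colab disk. Again just a sketch with placeholders (FILE_ID, my-project, my-bucket), and I'm not sure this is the right way:

```python
from google.cloud import storage
from google.colab import auth
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

auth.authenticate_user()  # grants default Colab credentials for both APIs

CHUNK = 150 * 1024 * 1024       # 150 MB per chunk
FILE_ID = "YOUR_DRIVE_FILE_ID"  # placeholder Drive file id

drive = build("drive", "v3")    # uses the default credentials from above
blob = (storage.Client(project="my-project")  # placeholder project id
        .bucket("my-bucket")                  # placeholder bucket name
        .blob("bigfile.bin"))                 # placeholder object name

# Download chunk by chunk, writing each chunk into the resumable upload stream
request = drive.files().get_media(fileId=FILE_ID)
with blob.open("wb", chunk_size=CHUNK) as out:
    downloader = MediaIoBaseDownload(out, request, chunksize=CHUNK)
    done = False
    while not done:
        status, done = downloader.next_chunk()
        print(f"{int(status.progress() * 100)}%")
```

Is something like this feasible, or is there a better way to do this copy without hitting the Colab storage limit?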