I have 10001920 images, named train_0, train_1, .... I tried to copy them like this:

!gsutil -m cp -r /content/train/* gs://{my_bucket_name}/data

That failed because it was too long, so I decided to use a wildcard instead:

!gsutil -m cp -r /content/train/train_1????.png gs://{my_bucket_name}/data

I also wanted to upload in an iterative way, so I used a for statement to generate the command lines and ran them:

for script in script_list:
    os.system(script)

And it returned:

31512
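
For context, here is a minimal sketch of what such a loop might look like; the bucket name and prefix range below are placeholders, not my real values:

import os

my_bucket_name = "your-bucket-name"  # placeholder

# Each wildcard matches one batch of up to 10,000 files; for example,
# train_1????.png matches train_10000.png through train_19999.png.
# The prefix range is hypothetical and depends on how the files are numbered.
script_list = [
    f"gsutil -m cp /content/train/train_{prefix}????.png gs://{my_bucket_name}/data"
    for prefix in range(1, 1001)
]

for script in script_list:
    status = os.system(script)  # returns the raw wait status, not just the exit code
    if status != 0:
        print(f"failed with status {status}: {script}")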

I just want to know how I can upload this huge number of files to GCS. Please give me some ideas.

K.S Kim

1 Answer


I don't think * should be used. It's not used that way in the documentation. I'd just try:

!gsutil -m cp -r /content/train gs://{my_bucket_name}/data

This explains the failure number:

Also, although most commands normally fail upon encountering an error when the -m flag is disabled, all commands continue to try all operations when -m is enabled with multiple threads or processes, and the number of failed operations (if any) are reported as an exception at the end of the command's execution.
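
If some objects still fail with -m, one option is to re-run the transfer with gsutil rsync, which only copies files that aren't already in the bucket. A minimal sketch, assuming the same Colab path and a placeholder bucket name:

import os

my_bucket_name = "your-bucket-name"  # placeholder

# rsync copies only the objects that are missing (or different) in the bucket,
# so re-running it after a partially failed cp picks up the files that failed.
status = os.system(f"gsutil -m rsync -r /content/train gs://{my_bucket_name}/data")
print("all objects in sync" if status == 0 else f"rsync reported failures (status {status})")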

BeRT2me
  • Hi, I appreciate your suggestion and I'll try it. But in my opinion there are too many files in the train folder, so it would not work at this moment. – K.S Kim Aug 01 '22 at 23:43