I am currently running shell commands from a Python script using os.system, like this:
import os
import time

time_lst = []  # per-bucket transfer time in seconds
for bucket in bucket_lst:
    start = time.time()
    # Sync the local TFRecord directory to the destination bucket
    command = "gsutil rsync -r /home/imagenet/tf_records " + bucket
    os.system(command)
    end = time.time() - start
    time_lst.append(end)
What I'm doing here is transferring data from a Google Compute Engine instance to Google Cloud Storage buckets in various regions (the bucket URLs are stored in bucket_lst), and measuring the time it takes to finish the transfer to each region.
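For context, bucket_lst is just a list of destination bucket URLs, one per region; the names below are made-up placeholders, not my real bucket names:

# Hypothetical example of bucket_lst; the real list has about 30 entries,
# one bucket per region.
bucket_lst = [
    "gs://my-bucket-us-east1",
    "gs://my-bucket-europe-west1",
    "gs://my-bucket-asia-northeast1",
]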
Each transfer to a region takes about one to two hours, and there are about 30 regions, so I need to run this process in the background with nohup, since the SSH connection to the GCE instance gets dropped often.
Currently, I tried the command "nohup python3 gce_to_gcs_throuhgput.py", but it seems to end the process after running only the very first iteration of the for-loop. Why is this happening, and how can I fix it so that the nohup command keeps running until the data has been transferred to every region?