I just want to know why some of the invocations are working and some are not. I couldn't find any official documentation about a time delay or a restriction in the quotas provided by Google.
For example: I created a dataframe with the following code:
import pandas as pd
from google.cloud import storage

# Write the DataFrame to /tmp, then upload the file to the bucket
empty_df = pd.DataFrame(val)
empty_df.to_csv('/tmp/{}.csv'.format(SAMPLE))

storage_client = storage.Client()
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob('FOLDER1/{}.csv'.format(SAMPLE))
blob.upload_from_filename('/tmp/{}.csv'.format(SAMPLE))
The SAMPLE variable changes on every loop iteration. I ran this in a for loop, and the Cloud Function was also triggered multiple times (anywhere from 1 to 50, sometimes more). Up to this point everything looks fine. But after the function completes, some of the CSV files do not show up in the FOLDER1 folder. I have the same problem with the copy_blob function.
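One thing worth ruling out: if two concurrent invocations ever produce the same SAMPLE value, the later upload silently overwrites the earlier object, which would look like a missing file. A quick way to exclude that is to make every object name unique, e.g. by appending a UUID (just a sketch; the helper name and the FOLDER1 prefix are placeholders for my setup):

```python
import uuid

def unique_object_name(sample, folder='FOLDER1'):
    """Build an object name that cannot collide across invocations."""
    return '{}/{}_{}.csv'.format(folder, sample, uuid.uuid4().hex)

# Every call produces a distinct name, even for the same sample value
name_a = unique_object_name('batch1')
name_b = unique_object_name('batch1')
```

If the files stop disappearing with unique names, the cause was overwriting rather than a quota or delay.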
For example: I want to move CSV files from FOLDER1 to FOLDER2 with a new name, created with the code above. Some of the CSV files do not appear in FOLDER2, and the logs show 404 file-not-found errors. But when I check the buckets manually, I can see the files there.
def copy_blob(
    bucket_name, blob_name, destination_bucket_name, destination_blob_name,
    status_path, delete_blob=False
):
    """Copies a blob from one bucket to another with a new name."""
    storage_client = storage.Client()
    source_bucket = storage_client.bucket(bucket_name)
    source_blob = source_bucket.blob(blob_name)
    destination_bucket = storage_client.bucket(destination_bucket_name)

    blob_copy = source_bucket.copy_blob(
        source_blob, destination_bucket, destination_blob_name
    )

    # Delete the source blob after copying, if requested
    if delete_blob:
        source_blob.delete()

    print(
        "Blob {} in bucket {} copied to blob {} in bucket {}.".format(
            source_blob.name,
            source_bucket.name,
            blob_copy.name,
            destination_bucket.name,
        )
    )
I used that code to move the files. Does anyone have an idea?
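Since the 404s look transient (the files are there when I check manually), one workaround I considered is wrapping the copy in a small retry with exponential backoff. A generic sketch below; in the real code the exception to catch would be google.api_core.exceptions.NotFound, which is stood in for here by a plain exception class so the snippet is self-contained:

```python
import time

def with_retry(fn, attempts=5, base_delay=0.5, exc=Exception):
    """Call fn(), retrying with exponential backoff on the given exception."""
    for attempt in range(attempts):
        try:
            return fn()
        except exc:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep(base_delay * (2 ** attempt))

# In copy_blob the call would look roughly like this
# (NotFound is the real exception from google.api_core.exceptions):
# blob_copy = with_retry(
#     lambda: source_bucket.copy_blob(
#         source_blob, destination_bucket, destination_blob_name
#     ),
#     exc=NotFound,
# )
```

This is only a workaround, not an explanation of why the 404s happen in the first place.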