
• Installed Python 3.7.2
• Created a GCP service account, granted it the Owner role, enabled the Storage API, and created a Cloud Storage bucket
• Now I'm trying to upload files to a Cloud Storage folder using a Python script, but I can't. Using the same structure, I am able to create a new bucket and to edit existing files in it
• I have attached the Python script below

Refs used:
https://googleapis.github.io/google-cloud-python/latest/storage/blobs.html
https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python

from google.cloud import storage

bucket_name='buckettest'
source_file_name='D:/file.txt'
source_file_name1='D:/jenkins structure.png'
destination_blob_name='test/'

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    client = storage.Client.from_service_account_json('D:\gmailseviceaccount.json')
    bucket = client.create_bucket('bucketcreate')
    bucket = client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name) 
    blob.upload_from_filename(source_file_name1)

    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))

if __name__ == '__main__':
        upload_blob(bucket_name, source_file_name, destination_blob_name)
Yasho R
  • Hello @Yasho R, welcome to StackOverflow! Which roles has your service account granted? To get them run [this command](https://stackoverflow.com/a/50485552/7757976) and edit the question with the result. – llompalles Mar 14 '19 at 12:02
  • What is the error message (stack trace)? – John Hanley Mar 14 '19 at 20:34

1 Answer


I was able to run your code and debug it. I will put what I used below and explain the changes I made.

As you did, I set my service account as Owner and was able to upload. I recommend following the best practice of least privilege once you're done testing.

  1. I removed client.create_bucket, since bucket names are globally unique and we shouldn't hard-code a name to create on every run. You can come up with a naming convention for your needs; for testing, I simply removed the call.
  2. I fixed the variable destination_blob_name, since you were using it as a folder for the file to be placed in. This does not work because GCS has no real folders; it only has object names. What was actually happening is that your TXT file was being uploaded as an object literally named 'test/', which the console renders as a folder. For a better understanding, I recommend reading the documentation on How Subdirectories Work.

    from google.cloud import storage
    
    bucket_name='bucket-test-18698335'
    source_file_name='./hello.txt'
    destination_blob_name='test/hello.txt'
    
    def upload_blob(bucket_name, source_file_name, destination_blob_name):
        """Uploads a file to the bucket."""
        client = storage.Client.from_service_account_json('./test.json')
        bucket = client.get_bucket(bucket_name)
        blob = bucket.blob(destination_blob_name)
        blob.upload_from_filename(source_file_name) 
    
        print('File {} uploaded to {}.'.format(
            source_file_name,
            destination_blob_name))
    
    if __name__ == '__main__':
        upload_blob(bucket_name, source_file_name, destination_blob_name)
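
To see why 'test/hello.txt' behaves like a file "inside" a folder, here is a minimal sketch of the flat-namespace idea (the object names below are hypothetical, and the prefix filter stands in for what `client.list_blobs(bucket_name, prefix='test/')` does server-side):

    # In GCS the '/' in an object name is just a character in a flat key,
    # not a directory separator. "Listing a folder" is a prefix filter.
    object_names = [
        "test/hello.txt",
        "test/jenkins structure.png",
        "other/readme.md",
    ]

    def list_under(names, prefix):
        # Equivalent in spirit to listing blobs with prefix=prefix.
        return [n for n in names if n.startswith(prefix)]

    print(list_under(object_names, "test/"))
    # ['test/hello.txt', 'test/jenkins structure.png']

This is also why uploading to a destination name of just 'test/' creates a zero-path object named 'test/' instead of placing your file inside anything.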
    
ZUKINI