I cannot find a way to write a data set from my local machine into Google Cloud Storage using Python. I have researched a lot but didn't find any clue regarding this. Need help, thanks

- did you ever find a way? It seems people tend to confuse the upload with an actual write – Manza Aug 18 '18 at 01:49
- Does this answer your question? [How to upload a file to Google Cloud Storage on Python 3?](https://stackoverflow.com/questions/37003862/how-to-upload-a-file-to-google-cloud-storage-on-python-3) – ggorlen Jan 26 '23 at 23:44
5 Answers
Quick example, using the google-cloud Python library:
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    blob.upload_from_filename(source_file_name)

    print('File {} uploaded to {}.'.format(
        source_file_name,
        destination_blob_name))
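For example, a call might look like this (the bucket and file names are placeholders, not from the answer):

upload_blob('my-bucket', '/path/to/local/dataset.csv', 'datasets/dataset.csv')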
More examples are in this GitHub repo: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/storage/cloud-client

- Based on this link https://cloud.google.com/appengine/docs/standard/python/googlecloudstorageclient/read-write-to-cloud-storage (from what I understood), this will allow you to create the file in the bucket, instead of uploading it – Manza Aug 20 '18 at 11:13
- Creating a file by specifying its contents is the same operation as uploading a new file. It's just called a different name here. – Brandon Yarbrough Aug 20 '18 at 21:30
- My bad, I thought that with the upload you create a file and then upload it, whereas with the other you create the file directly in the bucket folder and hence never on your server. I am having issues because App Engine doesn't allow me to create files, so I am trying to find that solution – Manza Aug 20 '18 at 23:29
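A minimal sketch of the point Brandon makes in the comments above, assuming the google-cloud-storage library and placeholder bucket/object names: the object is created directly from an in-memory string, so nothing is ever written to local disk (the later answers expand on this).

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket('my-bucket')                  # placeholder bucket name
blob = bucket.blob('reports/created-in-place.txt')       # placeholder object name
blob.upload_from_string('contents written straight to the bucket\n')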
In the earlier answers, I still miss the easiest way, using the open() method. You can use blob.open() as follows:
from google.cloud import storage

def write_file(lines):
    client = storage.Client()
    bucket = client.get_bucket('bucket-name')
    blob = bucket.blob('path/to/new-blob-name.txt')
    # Use bucket.get_blob('path/to/existing-blob-name.txt') to write to existing blobs
    with blob.open(mode='w') as f:
        for line in lines:
            f.write(line)
You can find more examples and snippets here: https://github.com/googleapis/python-storage/tree/main/samples/snippets
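As a usage sketch (the bucket and blob names repeat the placeholders above), the same open() API also supports mode='r' for reading the object back:

write_file(['first line\n', 'second line\n'])

# Read the object back with the same API.
client = storage.Client()
blob = client.get_bucket('bucket-name').blob('path/to/new-blob-name.txt')
with blob.open(mode='r') as f:
    print(f.read())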
When we want to write a string to a GCS bucket blob, the only change necessary is using blob.upload_from_string(your_string) rather than blob.upload_from_filename(source_file_name):
from google.cloud import storage

def write_to_cloud(your_string):
    client = storage.Client()
    bucket = client.get_bucket('bucket123456789')
    blob = bucket.blob('PIM.txt')
    blob.upload_from_string(your_string)
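For example, the string could be a data set serialized in memory; pandas here is only an illustrative assumption, not part of the answer:

import pandas as pd

df = pd.DataFrame({'a': [1, 2], 'b': [3, 4]})    # hypothetical data set
write_to_cloud(df.to_csv(index=False))           # uploads the CSV text to the blob hard-coded above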
A different approach uses the older discovery-based JSON API client (googleapiclient) instead of google-cloud-storage:

from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

# Build a client for the Cloud Storage JSON API using application default credentials.
credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)

filename = 'file.csv'
bucket = 'Your bucket name here'

# Insert the local file into the bucket as an object named 'file.csv'.
body = {'name': 'file.csv'}
req = service.objects().insert(bucket=bucket, body=body, media_body=filename)
resp = req.execute()

from google.cloud import storage

def write_to_cloud(buffer):
    client = storage.Client()
    bucket = client.get_bucket('bucket123456789')
    blob = bucket.blob('PIM.txt')
    blob.upload_from_file(buffer)
While Brandon's answer indeed gets the file to Google Cloud, it does this by uploading the file, as opposed to writing the file. This means that the file needs to exist on your disk before you upload it to the cloud.

My proposed solution uses an "in-memory" payload (the buffer parameter) which is then written to the cloud. To write the content you need to use upload_from_file instead of upload_from_filename, everything else being the same.
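As a usage sketch (the io.BytesIO wrapper is an assumption for illustration), the buffer can be any file-like object opened in binary mode, for example an in-memory byte stream:

import io

buffer = io.BytesIO()
buffer.write(b'first line\n')
buffer.write(b'second line\n')
buffer.seek(0)            # rewind so upload_from_file reads from the beginning
write_to_cloud(buffer)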