
I have some large binary files stored in Google Cloud Storage that require byte-by-byte processing. I would like to do this processing in a Cloud Function, but that would require streaming the binary data through the function. Does anybody have any idea how to do this? I'm using Python 3.7.

AlMacOwl
  • Here is an example of how to stream data from Cloud Storage to BigQuery with Cloud Functions - https://cloud.google.com/solutions/streaming-data-from-cloud-storage-into-bigquery-using-cloud-functions . I do not recommend using Cloud Functions to process large files, though. I would recommend other serverless options such as App Engine or Cloud Run instead. Processing large files may take a while, and Cloud Functions time out after a certain period. – Andrei Tigau Feb 18 '20 at 08:56
    Thanks for the great advice. I did manage to get something working using GCSFS, which lets me open files in Cloud Storage and stream the binary data as if the files were local. Your comment about large files and timeouts has me worried; I may have to rethink this. – AlMacOwl Feb 20 '20 at 00:18
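
The GCSFS approach mentioned in the comment can be sketched roughly as below. The bucket and file names are placeholders, and the chunked helper is written so it works on any binary file object; `gcsfs.GCSFileSystem()` and `fs.open(..., 'rb')` are the library's standard entry points, but authentication setup is omitted:

```python
# Sketch of chunked byte-by-byte processing of a GCS object via gcsfs.
# The bucket/object path below is hypothetical; gcsfs must be installed
# and authenticated (e.g. via the Cloud Function's service account).
import io

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read keeps memory usage flat

def process_stream(fobj, chunk_size=CHUNK_SIZE):
    """Consume a binary file object in fixed-size chunks.

    Here we just count bytes; replace the body of the loop with the
    real byte-level processing.
    """
    total = 0
    while True:
        chunk = fobj.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)
    return total

# Inside the Cloud Function, the same helper would run against a
# gcsfs file object instead of a local one:
#
#   import gcsfs
#   fs = gcsfs.GCSFileSystem()
#   with fs.open('my-bucket/large-file.bin', 'rb') as f:  # hypothetical path
#       result = process_stream(f)

# Local smoke test with an in-memory stream:
print(process_stream(io.BytesIO(b'\x00' * 3000), chunk_size=1024))  # → 3000
```

Because `process_stream` only depends on the file-object interface, it can be exercised locally with `io.BytesIO` before being wired up to Cloud Storage, which also sidesteps the timeout concern during development.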

0 Answers