
I want to create a zip file with all the files inside a bucket folder and write this zip file back to Google Cloud Storage. I want to do this with the App Engine standard environment, but I haven't found a good example of doing this.

  • Hi user2494876 and welcome to Stack Overflow! What have you tried so far? – Max von Hippel May 31 '18 at 15:46
  • At the moment I'm looking into feasibility: because App Engine standard doesn't allow writing to a local disk, I don't know if it's possible to download all the files before creating the zip – user2494876 May 31 '18 at 16:12
  • Not with App Engine. What I did was: create a Cloud Function that is triggered on new file creation in the bucket; that function fetches the files and creates a zip using packages like [jszip](https://www.npmjs.com/package/jszip), then adds the zip to storage. Note: if you want to generate the zip on demand instead, just change the trigger to HTTP requests. – AdityaG15 Jan 12 '21 at 06:02

1 Answer


If the writable temporary file normally needed during zip creation would fit in the available memory of your instance class, you may be able to use an in-memory buffer (StringIO, or io.BytesIO on Python 3) and avoid writing to the filesystem at all. See, for an example, How to zip or tar a static folder without writing anything to the filesystem in python?
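A minimal sketch of that in-memory approach. The zipping itself is plain stdlib `zipfile` + `io.BytesIO`; the GCS round-trip in the trailing comment uses the modern `google-cloud-storage` client and illustrative bucket/object names, which are my assumptions rather than part of the original answer:

```python
import io
import zipfile


def zip_in_memory(files):
    """Zip a mapping of {archive_path: bytes} entirely in memory.

    Returns a BytesIO positioned at the start, ready to be uploaded.
    The whole archive lives in RAM, so total size must fit the
    instance class's memory limit.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files.items():
            zf.writestr(name, data)
    buf.seek(0)
    return buf


# Sketch of the GCS round-trip (requires google-cloud-storage; the
# bucket name and "folder/" prefix are placeholders):
#
#   from google.cloud import storage
#   client = storage.Client()
#   bucket = client.bucket("my-bucket")
#   blobs = {b.name: b.download_as_bytes()
#            for b in client.list_blobs("my-bucket", prefix="folder/")}
#   archive = zip_in_memory(blobs)
#   bucket.blob("folder.zip").upload_from_file(
#       archive, content_type="application/zip")
```

Note that this downloads every source object into memory too, so the practical ceiling is roughly (sum of inputs + compressed archive) against the instance's RAM.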

It may also be possible to write the zip file directly to GCS, basically using the GAE app as a pipeline, which might circumvent the instance memory limitation mentioned above. You'd have to try it out, though; I don't have an actual example. The tricks to watch for would be picking the right file-handle arguments and maybe buffering options. An example of directly accessing a GCS file (except you'd want to write to it instead of reading from it) is How to open gzip file on gae cloud?
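The pipeline idea hinges on `zipfile` being able to write to a streaming, non-seekable handle (a GCS write handle can't seek), which CPython supports since 3.5. The sketch below verifies that locally with a deliberately unseekable wrapper; the actual GCS hookup in the trailing comment assumes the modern `google-cloud-storage` client, whose `Blob.open("wb")` streams via a resumable upload (the original answer's era `cloudstorage` library exposed a similar file-like handle through `gcs.open()`):

```python
import io
import zipfile


class UnseekableWriter(io.RawIOBase):
    """Write-only wrapper that mimics a streaming GCS file handle:
    data can only be appended, never seeked. zipfile falls back to
    data descriptors when the sink reports seekable() == False."""

    def __init__(self, sink):
        self._sink = sink

    def writable(self):
        return True

    def write(self, b):
        return self._sink.write(b)

    def seekable(self):
        return False


def stream_zip(entries, dest):
    """Write (name, bytes) pairs into `dest` as a zip, never seeking."""
    with zipfile.ZipFile(dest, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in entries:
            zf.writestr(name, data)


# Hypothetical GCS hookup (bucket/object names are placeholders):
#
#   from google.cloud import storage
#   client = storage.Client()
#   bucket = client.bucket("my-bucket")
#   with bucket.blob("folder.zip").open("wb") as gcs_file:
#       stream_zip(
#           ((b.name, b.download_as_bytes())
#            for b in client.list_blobs("my-bucket", prefix="folder/")),
#           gcs_file,
#       )
```

Only one source object plus upload buffers need to be resident at a time, which is what lets this dodge the whole-archive-in-memory limit.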

Dan Cornilescu