37

How do I unzip a .zip file in a Google Cloud Storage bucket? (If there is a tool similar to 'CloudBerry Explorer' for AWS, that would be great.)

Naaga A

11 Answers

19

You can use Python, e.g. from a Cloud Function:

    from google.cloud import storage
    from zipfile import ZipFile
    from zipfile import is_zipfile
    import io

    def zipextract(bucketname, zipfilename_with_path):

        storage_client = storage.Client()
        bucket = storage_client.get_bucket(bucketname)

        destination_blob_pathname = zipfilename_with_path
        
        blob = bucket.blob(destination_blob_pathname)
        zipbytes = io.BytesIO(blob.download_as_string())

        if is_zipfile(zipbytes):
            with ZipFile(zipbytes, 'r') as myzip:
                for contentfilename in myzip.namelist():
                    contentfile = myzip.read(contentfilename)
                    blob = bucket.blob(zipfilename_with_path + "/" + contentfilename)
                    blob.upload_from_string(contentfile)

    zipextract("mybucket", "path/file.zip") # if the file is gs://mybucket/path/file.zip
davidmesalpz
Daniel Sparing
  • Hi Daniel, the code is not working for me; it never gets inside the if condition. – Bhagesh Arora Sep 18 '19 at 14:02
  • Copy-pasted this into a Colab notebook; I just needed to add auth and specify the project, and it worked like a charm. – Sylvain Gantois Feb 24 '20 at 21:04
  • Hey, I have been trying with my bucket gs://frames_final/frames_final_cat.zip and the function zipextract("frames_final", "frames_final_cat.zip"), but I am getting an error that the project was not passed to the environment. – yudhiesh Aug 24 '20 at 13:59
14

Here is some code I created to run as a Firebase Cloud Function. It listens for files uploaded to a bucket with the content type 'application/zip' and extracts them in place.

    const functions = require('firebase-functions');
    const admin = require("firebase-admin");
    const path = require('path');
    const fs = require('fs');
    const os = require('os');
    const unzip = require('unzipper')

    admin.initializeApp();

    const storage = admin.storage();


    const runtimeOpts = {
      timeoutSeconds: 540,
      memory: '2GB'
    }

    exports.unzip = functions.runWith(runtimeOpts).storage.object().onFinalize((object) => {

        return new Promise((resolve, reject) => {
            //console.log(object)
            if (object.contentType !== 'application/zip') {
              reject();
            } else {
              const bucket = storage.bucket(object.bucket)
              const remoteFile = bucket.file(object.name)
              const remoteDir = object.name.replace('.zip', '')

              console.log(`Downloading ${remoteFile}`)

              remoteFile.createReadStream()
                .on('error', err => {
                  console.error(err)
                  reject(err);
                })
                .on('response', response => {
                  // Server connected and responded with the specified status and headers.
                  //console.log(response)
                })
                .on('end', () => {
                  // The file is fully downloaded.
                  console.log("Finished downloading.")
                  resolve();
                })
                .pipe(unzip.Parse())
                .on('entry', entry => {
                  if (entry.type === 'Directory') {
                    // Nothing to write for directory entries; discard their stream.
                    entry.autodrain();
                    return;
                  }

                  const file = bucket.file(`${remoteDir}/${entry.path}`)

                  entry.pipe(file.createWriteStream())
                  .on('error', err => {
                    console.log(err)
                    reject(err);
                  })
                  .on('finish', () => {
                    console.log(`Finished extracting ${remoteDir}/${entry.path}`)
                  });
                });
            }
        })

    });
stkvtflw
Aeyrium
  • Thank you for your example, however I have found a serious flaw with it: you should only call entry.autodrain() if you don't consume the stream. Otherwise you will have corrupted output files. I did, until I changed the code. – Anders Emil Mar 06 '19 at 14:46
  • I can add that you should replace firebase.storage.bucket(object.bucket) with admin.storage().bucket(), as firebase is not defined. – Arkady Dec 23 '19 at 19:40
12

If you ended up with a zip file in your Google Cloud Storage bucket because you had to move large files from another server with the gsutil cp command, you could instead gzip while copying: the file is transferred in compressed format, stored with Content-Encoding: gzip, and decompressed on the fly when it is read back.

This is built into gsutil cp via the -Z argument.

E.g.

gsutil cp -Z largefile.txt gs://bucket/largefile.txt
thephper
  • If I understand OP's question correctly, they are asking for a tool that will upload a zipped file and then unzip it in the bucket. `gsutil cp -Z` doesn't do that. It zips the file and leaves it zipped in the bucket. Source: https://cloud.google.com/storage/docs/gsutil/commands/cp#synchronizing-over-os-specific-file-types-such-as-symlinks-and-devices – urig Mar 13 '22 at 09:03
  • Confirmed that this does not address the original question and it does not do the behavior described in the answer. – jwan Mar 25 '22 at 18:23
11

In a shell, you can use the command below to decompress a gzip-compressed object:

gsutil cat gs://bucket/obj.csv.gz | zcat | gsutil cp - gs://bucket/obj.csv
Dishant Mishra
  • Hi @Dishant, how can I use **zcat** with the Google Cloud SDK tool? I get `'zcat' is not recognized as an internal or external command`. – Abdurrahman I. Aug 28 '19 at 08:34
  • @AbdurrahmanI. As of now there is no functionality available in the Google Cloud SDK to unzip a file. The easiest way I found is to use this shell command. – Dishant Mishra Aug 28 '19 at 13:40
  • @DishantMishra If more than one file is present inside the zip, how can we unzip them all? We are giving one name (file.csv) as the destination, like: gsutil cat gs://bucket/csv_files.zip | unzip | gsutil cp - gs://bucket/one_file.csv. It works, but we can't see the multiple files that are inside csv_files.zip. – Bhagesh Arora Sep 05 '19 at 13:19
  • @BhageshArora The command will not work for multiple files zipped together, as it redirects the output to a single file. You may have to download the file, unzip it, and copy the extracted files to GCS. You can use the command below (if the JDK is installed) to directly download and unzip the files: `gsutil cat gs://bucket/obj.zip | jar xvf /dev/stdin` – Dishant Mishra Sep 13 '19 at 09:20
  • I struggled with this when processing multiple files and found a simple solution you might be interested in: `gsutil cat gs://bucket/*/*/*/*.gz | zcat | gsutil cp - gs://bucket/decompressed/result.json` – Abdurrahman I. Jan 07 '20 at 12:59
  • Is this a streaming unzip and upload, or does it download the entire file first, then unzip it, and finally upload it back to GCS? – Ashika Umanga Umagiliya Dec 28 '22 at 03:01
8

There is no built-in mechanism in GCS to unzip files. A feature request for this has already been forwarded to the Google development team.

As an alternative, you can upload the ZIP files to the GCS bucket and then download them to a persistent disk attached to a VM instance, unzip them there, and upload the unzipped files using the gsutil tool.
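
For example, a rough sketch of that download-extract-reupload workflow using the Python client instead of gsutil (the bucket name, object paths, and local directories below are placeholders):

    from google.cloud import storage
    from zipfile import ZipFile

    def unzip_via_local_disk(bucket_name, zip_path, local_dir="/tmp/extract"):
        client = storage.Client()
        bucket = client.bucket(bucket_name)

        # Download the archive to the local (or persistent) disk.
        local_zip = "/tmp/archive.zip"
        bucket.blob(zip_path).download_to_filename(local_zip)

        # Extract everything locally.
        with ZipFile(local_zip) as archive:
            archive.extractall(local_dir)
            members = [m for m in archive.namelist() if not m.endswith("/")]

        # Upload the extracted files next to the original archive.
        target_prefix = zip_path[:-4] if zip_path.endswith(".zip") else zip_path
        for member in members:
            bucket.blob(target_prefix + "/" + member).upload_from_filename(local_dir + "/" + member)

    unzip_via_local_disk("mybucket", "path/file.zip")  # hypothetical names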

Louie Bao
D Saini
8

There are Dataflow templates in Google Cloud Dataflow that help zip/unzip files in Cloud Storage, in particular the Bulk Decompress Cloud Storage Files template.

This template stages a batch pipeline that decompresses files on Cloud Storage to a specified location. This functionality is useful when you want to use compressed data to minimize network bandwidth costs. The pipeline automatically handles multiple compression modes during a single execution and determines the decompression mode to use based on the file extension (.bzip2, .deflate, .gz, .zip).

Pipeline requirements

The files to decompress must be in one of the following formats: Bzip2, Deflate, Gzip, Zip.

The output directory must exist prior to pipeline execution.

Vali7394
3
  1. Enable the Dataflow API in your Google Cloud console.
  2. Create a temp dir in your bucket (you can't use the root).
  3. Replace YOUR_REGION (e.g. europe-west6) and YOUR_BUCKET in the command below and run it with the gcloud CLI (the assumption is that the .gz file is at the bucket root; change the pattern if not):

    gcloud dataflow jobs run unzip \
    --gcs-location gs://dataflow-templates-YOUR_REGION/latest/Bulk_Decompress_GCS_Files \
    --region YOUR_REGION \
    --num-workers 1 \
    --staging-location gs://YOUR_BUCKET/temp \
    --parameters inputFilePattern=gs://YOUR_BUCKET/*.gz,outputDirectory=gs://YOUR_BUCKET/,outputFailureFile=gs://YOUR_BUCKET/decomperror.txt
Sparm
2

I'm afraid that by default there is no program in Google Cloud that can do this, but you can get this functionality, for example, using Python.

This is a universal method, available on any machine where Python is installed (so also on Google Cloud):

You need to enter the following commands:

python

or if you need administrator rights:

sudo python

and then in the Python Interpreter:

>>> from zipfile import ZipFile
>>> zip_file = ZipFile('path_to_file/t.zip', 'r')
>>> zip_file.extractall('path_to_extract_folder')

and finally, press Ctrl+D to exit the Python Interpreter.

The unpacked files will be located in the location you specify (of course, if you had the appropriate permissions for these locations).

The above method works identically for Python 2 and Python 3.

Enjoy it to the fullest! :)

simhumileco
  • I don't understand why this answer gets negative points. I tried to make it as useful as possible, correct and unique compared to others answers when added this one. If someone in the comment would write me what is wrong with it, I would be grateful. Thank you! – simhumileco Dec 12 '19 at 08:55
  • I tested your solution first as it seems easier than the accepted one. It failed for me at the second line when I specified a path gs://mybucket/myfolder/myfile.zip. I don't think ZipFile can access files from the bucket directly. – Sylvain Gantois Feb 24 '20 at 21:03
  • Thank you for your comment @SylvainGantois. Indeed, my solution assumes that we have access to the zip package within the usual folder structure. I hope that it will be useful to some users at least, just as it was for me. – simhumileco Feb 25 '20 at 10:59
  • The point of the question was to unzip from and to Google Storage, not unzipping on a local file system. On a VM with files on a local file system you might as well just use unzip. – de1 Jun 02 '20 at 17:39
  • Just to comment, there is a package gcsfs which enables one to reference a GCS bucket file the same way you would the normal file system; see the sketch just below. – Ian Wesley May 19 '22 at 18:51
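
Building on that last comment, here is a minimal sketch using gcsfs so that ZipFile can read the archive straight from the bucket and write the extracted members back to it; the project ID, bucket, and paths are placeholders:

    import gcsfs
    from zipfile import ZipFile

    fs = gcsfs.GCSFileSystem(project="my-project")

    # Open the zip object in the bucket as a file-like object.
    with fs.open("mybucket/myfolder/myfile.zip", "rb") as f:
        with ZipFile(f) as archive:
            for name in archive.namelist():
                if name.endswith("/"):
                    continue  # skip directory entries
                # Write each extracted member straight back into the bucket.
                with fs.open("mybucket/myfolder/myfile/" + name, "wb") as out:
                    out.write(archive.read(name))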
1

Another quick way to do it, using Python 3.2 or higher:

import shutil
shutil.unpack_archive('filename')

The method also allows you to indicate the destination folder:

shutil.unpack_archive('filename', 'extract_dir')

The above method works not only for zip archives, but also for tar, gztar, bztar, or xztar archives.

If you need more options, look into the documentation of the shutil module: shutil.unpack_archive

simhumileco
  • Just tested it. Works smoothly. Not sure why it was getting negative votes. Best solution in this thread, according to me. – Marc Jan 30 '21 at 13:32
  • @Marc Because it doesn't work for the Google Cloud files in the question. – YGao Nov 30 '22 at 21:16
1

You can create a Google Cloud Function with a Cloud Storage trigger.

When a new object is created, the function will be triggered.

const functions = require("@google-cloud/functions-framework");
const {Storage} = require("@google-cloud/storage");
const unzip = require("unzip-stream");

functions.cloudEvent("gcs-unzip", async cloudEvent => {

    //console.log(JSON.stringify(cloudEvent, null, 4));

    const zipFile = cloudEvent.data;

    //console.log(JSON.stringify(file, null, 4));

    if (zipFile.contentType === "application/zip") {
        const storage = new Storage();

        function unzipAndUploadContainedFiles() {
            // Wrap the stream pipeline in a Promise so the Cloud Function
            // does not return before the whole archive has been processed.
            return new Promise((resolve, reject) => {
                storage
                    .bucket(zipFile.bucket)
                    .file(zipFile.name)
                    .createReadStream()
                    .pipe(unzip.Parse())
                    .on("entry", function (entry) { //there could be multiple files and even a directory structure in the zip file
                        //console.log(JSON.stringify(entry, null, 4));

                        const gcsTargetFileName = zipFile.name.replace(".zip", "") + "/" + entry.path;
                        if (entry.type === "File") {
                            entry.pipe(storage.bucket(zipFile.bucket).file(gcsTargetFileName).createWriteStream());
                        } else {
                            entry.autodrain(); // discard directory entries
                        }
                    })
                    .on("error", reject)
                    .on("finish", resolve);
            });
        }

        await unzipAndUploadContainedFiles().catch(err => {
            console.error(err);
        });

    } else {
        console.log("Non-zip file ignored.");
    }

});
vpgcloud
0

To unzip all files within a zip, I used this one-liner in the Cloud Shell terminal:

gsutil cat gs://{bucket_name}/data.zip | for i in $(jar --list); do gsutil cat gs://{bucket_name}/data.zip | jar x $i && cat $i | gsutil cp - gs://{bucket_name}/unzipped/$i && rm ./$i; done;