
I have gzip files in Google Cloud Storage, and I need to check the checksum of the file that is compressed inside them using Cloud Functions.

I started from the unzip code in this example, but it only works with ZIP files, not gzip:

const stream = require('stream');
const unzipper = require('unzipper'); // parses ZIP archives, not gzip streams

// gcsSrcObject, dstBucket, TEMP and prefix are set up elsewhere in the function
gcsSrcObject.createReadStream()
    .pipe(unzipper.Parse())
    .pipe(new stream.Transform({
        objectMode: true,
        transform: function (entry, encoding, callback) {
            var filePath = entry.path;
            var type = entry.type;
            var size = entry.size;
            console.log(`Found ${type}: ${filePath}`);
            console.log(`Unzipping to: ${TEMP}/${prefix}/${filePath}`);
            var gcsDstObject = dstBucket.file(`${TEMP}/${prefix}/${filePath}`);
            entry
                .pipe(gcsDstObject.createWriteStream())
                .on('error', function (err) {
                    console.log(`Error: ${err}`);
                })
                .on('finish', function () {
                    console.log('Complete');
                    callback();
                });
        }
    }));

I have also read the documentation about the native storage tooling (gsutil cp), but it only lets you gzip files when copying from local.
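For reference, here is a sketch of the client-library counterpart of that behaviour, assuming the Node.js @google-cloud/storage client (the bucket and file names are placeholders): compression only happens at upload time, via the gzip option, which stores the object with Content-Encoding: gzip.

    const { Storage } = require('@google-cloud/storage');
    const storage = new Storage();

    // Sketch only: names are placeholders. The gzip option compresses the file
    // in transit and stores it with Content-Encoding: gzip, roughly what
    // gsutil cp -z / -Z does on upload.
    storage.bucket('my-bucket')
        .upload('/tmp/data.csv', { destination: 'data.csv', gzip: true })
        .then(() => console.log('Uploaded and gzipped in transit'));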

Viento
  • [Here](https://stackoverflow.com/questions/12148948/how-do-i-ungzip-decompress-a-nodejs-requests-module-gzip-response-body) is an example using zlib. Does it answer your question? – A.Queue Jul 12 '18 at 09:57

1 Answer


Remember that if your files are stored in GCS with "Content-Encoding: gzip", then you can decompress them on the fly transparently. GCS calls this "transcoding":

https://cloud.google.com/storage/docs/transcoding
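If you also need the checksum of the decompressed content from inside the Cloud Function, a minimal sketch (assuming the object is a plain .gz file without Content-Encoding: gzip metadata; the bucket and object names are placeholders) is to stream it through zlib and hash the output:

    const crypto = require('crypto');
    const zlib = require('zlib');
    const { Storage } = require('@google-cloud/storage');

    const storage = new Storage();

    // Sketch: MD5 of the *decompressed* content of a .gz object.
    // bucketName / fileName are placeholders.
    function md5OfGunzippedObject(bucketName, fileName) {
        return new Promise((resolve, reject) => {
            const hash = crypto.createHash('md5');
            const source = storage.bucket(bucketName).file(fileName).createReadStream();
            const gunzip = zlib.createGunzip();

            source.on('error', reject);
            gunzip.on('error', reject);
            gunzip.on('data', (chunk) => hash.update(chunk));
            // base64 digest, to compare against the MD5s that GCS / gsutil report
            gunzip.on('end', () => resolve(hash.digest('base64')));

            source.pipe(gunzip);
        });
    }

Note that gsutil hash (and the object's stored md5Hash metadata) refers to the compressed bytes as stored, while the hash above is computed over the decompressed content. If the object is stored with Content-Encoding: gzip and the client already decompresses it transparently, drop the createGunzip() step.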

pantulis
  • This is a good departure point, but I'm afraid the operations I need over those files (for example gsutil hash) don't seem to work with the ungzipped file. All the values returned relate to the gzip file instead of the uncompressed one. – Viento Jul 03 '18 at 13:28