
I am reading an image from a URL and processing it. I need to upload this data to a file in cloud storage. Currently I am writing the data to a temporary file, uploading that file, and then deleting it. Is there a way I can upload the data directly to cloud storage?

static async uploadDataToCloudStorage(rc : RunContextServer, bucket : string, path : string, data : any, mimeVal : string | false) : Promise<string> {
  if(!mimeVal) return ''

  const extension = mime.extension(mimeVal),
        filename  = await this.getFileName(rc, bucket, extension, path),
        modPath   = (path) ? (path + '/') : '',
        tempPath  = `/tmp/${filename}.${extension}`

  // fs.writeFileSync and fs.unlinkSync are synchronous and return no promise,
  // so they are not awaited
  fs.writeFileSync(tempPath, data, 'binary')

  const fileUrl = await this.upload(rc, bucket, tempPath,
                          `${modPath}${filename}.${extension}`)

  fs.unlinkSync(tempPath)

  return fileUrl
}

static async upload(rc : RunContextServer, bucketName: string, filePath : string, destination : string) : Promise<string> {
  const bucket : any = cloudStorage.bucket(bucketName),
        data   : any = await bucket.upload(filePath, {destination})

  return data[0].metadata.name
}
Akash Dathan

4 Answers


Yes, it's possible to retrieve an image from a URL, perform edits to the image, and upload it to Google Cloud Storage (or Firebase Storage) using Node.js, without ever saving the file locally.

This builds on Akash's answer with a complete function that worked for me, including the image-manipulation step.

Note for Firebase users: if you use Firebase Storage, you must still use this library; the Firebase web implementation of Storage does not work in Node. A bucket created through Firebase is also accessible through the Google Cloud Storage console, because they are the same thing.
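
For illustration, a minimal sketch of pointing @google-cloud/storage at a Firebase-created bucket (the name <project-id>.appspot.com is the default naming pattern Firebase uses; substitute your own bucket name):

const { Storage } = require('@google-cloud/storage');

// A bucket created through Firebase is an ordinary GCS bucket,
// typically named <project-id>.appspot.com
const storage = new Storage({ projectId: '<project-id>' });
const firebaseBucket = storage.bucket('<project-id>.appspot.com');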

const axios = require('axios');
const sharp = require('sharp');
const { Storage } = require('@google-cloud/storage');

const processImage = (imageUrl) => {
    return new Promise((resolve, reject) => {

        // Your Google Cloud Platform project ID
        const projectId = '<project-id>';

        // Creates a client
        const storage = new Storage({
            projectId: projectId,
        });

        // Configure axios to receive a response type of stream, and get a readableStream of the image from the specified URL
        axios({
            method:'get',
            url: imageUrl,
            responseType:'stream'
        })
        .then((response) => {

            // Create the image-manipulation pipeline: resize to 300px wide, output JPEG
            const transformer = sharp()
                .resize(300)
                .jpeg();

            const gcFile = storage.bucket('<bucket-name>').file('my-file.jpg');

            // Pipe the axios response data through the image transformer and to Google Cloud
            response.data
            .pipe(transformer)
            .pipe(gcFile.createWriteStream({
                resumable  : false,
                validation : false,
                contentType: "auto",
                metadata   : {
                    cacheControl: 'public, max-age=31536000'}
            }))
            .on('error', (error) => { 
                reject(error) 
            })
            .on('finish', () => { 
                resolve(true)
            });
        })
        .catch(err => {
            // reject() takes a single argument, so fold the message and cause together
            reject(new Error(`Image transfer error: ${err.message}`));
        });
    })
}

processImage("<url-to-image>")
.then(res => {
  console.log("Complete.", res);
})
.catch(err => {
  console.log("Error", err);
});
Matthew Rideout
    This is the exact answer. Pipe the transform then pipe the write stream. You can grab the signedUrl right after this no problem as well. Thanks Matthew R – Emmett Harper Dec 26 '19 at 20:56
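
As a sketch of what this comment points at (assuming the gcFile handle from the answer above, inside an async function), a signed read URL can be requested once the stream finishes:

// Hypothetical follow-up: request a time-limited signed read URL
// after the 'finish' event has fired
const [signedUrl] = await gcFile.getSignedUrl({
    action : 'read',
    expires: Date.now() + 60 * 60 * 1000 // one hour from now
});
console.log(signedUrl);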

The data can be uploaded without writing to a file by using Node's streams.

const stream     = require('stream'),
      dataStream = new stream.PassThrough(),
      gcFile     = cloudStorage.bucket(bucketName).file(fileName)

// Push the data into the stream, then signal end-of-input with null
dataStream.push('content-to-upload')
dataStream.push(null)

await new Promise((resolve, reject) => {
  dataStream.pipe(gcFile.createWriteStream({
    resumable  : false,
    validation : false,
    metadata   : {cacheControl: 'public, max-age=31536000'}
  }))
  .on('error', (error : Error) => { 
    reject(error) 
  })
  .on('finish', () => { 
    resolve(true)
  })
})
Akash Dathan
    Working pretty well, thanks a lot. Note: if you hit an issue while uploading a CSV file, check your contentType metadata; I ran into this myself, which is why I mention it here. Thanks again – Krrish Nov 21 '19 at 09:03
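
Following up on that comment, a minimal sketch of pinning the content type explicitly (gcFile and dataStream as in the answer above; text/csv and the option names follow the @google-cloud/storage docs):

dataStream.pipe(gcFile.createWriteStream({
  resumable  : false,
  validation : false,
  contentType: 'text/csv', // set explicitly so the stored CSV is served with the right type
  metadata   : {cacheControl: 'public, max-age=31536000'}
}))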

This thread is old, but in the current API the File object works with streams.

So you can have something like this to upload a JSON file from memory:

const { Readable } = require("stream")
const { Storage } = require('@google-cloud/storage');

const bucketName = '...';
const filePath = 'test_file_from_memory.json';
const storage = new Storage({
  projectId: '...',
  keyFilename: '...'
});
(() => {
  const json = {
    prop: 'one',
    att: 2
  };
  const file = storage.bucket(bucketName).file(filePath);
  // Readable.from (Node 12+) turns the JSON string into a readable stream
  Readable.from(JSON.stringify(json))
    .pipe(file.createWriteStream({
      metadata: {
        contentType: 'application/json' // the registered MIME type for JSON
      }
    }).on('error', (error) => {
      console.log('error', error)
    }).on('finish', () => {
      console.log('done');
    }));
})();

Source: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream
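
To sanity-check the result (a sketch assuming the same storage, bucketName, and filePath as above, inside an async function), the object can be read back with File#download:

// Download the stored object into memory and parse it back
const [contents] = await storage.bucket(bucketName).file(filePath).download();
console.log(JSON.parse(contents.toString()));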

Ezequiel Alanis

You can also upload multiple files at once, for example in a NestJS controller using AnyFilesInterceptor with multer's in-memory buffers:

import * as stream from 'stream';
import { Post, UseInterceptors, UploadedFiles } from '@nestjs/common';
import { AnyFilesInterceptor } from '@nestjs/platform-express';
import { Storage } from '@google-cloud/storage';

// Controller method: receives the uploaded files from multer's memory storage
@Post('upload')
@UseInterceptors(AnyFilesInterceptor())
async uploadFiles(@UploadedFiles() files: Array<Express.Multer.File>) {
    const storage = new Storage();
    for (const file of files) {
        const dataStream = new stream.PassThrough();
        const gcFile = storage.bucket('upload-lists').file(file.originalname);
        dataStream.push(file.buffer);
        dataStream.push(null);
        await new Promise((resolve, reject) => {
            dataStream.pipe(gcFile.createWriteStream({
                resumable: false,
                validation: false,
                // Enable long-lived HTTP caching headers
                // Use only if the contents of the file will never change
                // (If the contents will change, use cacheControl: 'no-cache')
                metadata: { cacheControl: 'public, max-age=31536000' }
            })).on('error', (error: Error) => {
                reject(error);
            }).on('finish', () => {
                resolve(true);
            });
        });
    }
}
Danaahm