
I'm managing a portal, built on the MERN stack, in which users can independently upload audio/video content they have created. I decided to use Google Cloud Storage as the service to store user content. (I know I could also use Firebase; for the moment I prefer to use "only" Cloud Storage.)

In the backend, with Express, I created a POST /usercontent endpoint and, through multer, I receive the file and then save it to Google Cloud Storage.

Everything works perfectly, but I'm not sure I'm doing it correctly: I would like the file to be saved DIRECTLY to Google Cloud Storage, without storing it in the backend, not even temporarily. This is because content may be large and I may run into memory problems.

I understand that I need to use streams, piping and PassThrough, but my knowledge of this topic is still limited. I have read the answers and examples linked below carefully, but I just can't figure out how to do this properly while also incorporating multer; the pattern I keep seeing in them is sketched after the sources.

Sources:

https://github.com/googleapis/nodejs-storage/blob/main/samples/streamFileUpload.js
Unable to Pipe File Read Stream from Google Cloud Storage to Google Drive API
https://cloud.google.com/storage/docs/streaming#code-samples
How to upload an in memory file data to google cloud storage using nodejs?
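
As far as I can tell, the pattern those samples show boils down to something like this (my paraphrase of the streamFileUpload sample; bucketName, destFileName and contents are placeholders):

import stream from 'stream';
import { Storage } from '@google-cloud/storage';

const storage = new Storage();

// Paraphrased from the Cloud Storage streaming-upload sample:
// wrap the data in a PassThrough and pipe it into the file's write stream.
function streamFileUpload(bucketName, destFileName, contents) {
  const file = storage.bucket(bucketName).file(destFileName);

  const passthroughStream = new stream.PassThrough();
  passthroughStream.write(contents);
  passthroughStream.end();

  passthroughStream
    .pipe(file.createWriteStream())
    .on('error', console.error)
    .on('finish', () => console.log(`${destFileName} uploaded`));
}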

My code:

import Multer from 'multer';
import { Storage } from '@google-cloud/storage';

// config multer
const multerStorage = Multer({
  storage: Multer.memoryStorage(),
  limits: { fileSize: 10*1024*1024 }
});

// set endpoint
router.post('/upload', multerStorage.single('uploadedFile'), async (req, res, next) => {

  // creates a client from a Google service account key
  const storage = new Storage({
    projectId: 'xxx',
    keyFilename: 'yyy'
  });

  // access bucket
  const bucket = storage.bucket('bucketName');

  // create new file inside bucket
  const newFile = bucket.file(req.file.originalname);

  // STREAM ***HERE MY DOUBTS! ***
  const fileStream = newFile.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    },
    resumable: false
  });

  fileStream.write(req.file.buffer);

  fileStream.on('error', err => {
    console.error('error', err)
  });

  fileStream.on('finish', () => {
    console.log('finish!');
  });

  fileStream.end();

  // send response
  res.status(200).json({ 
    status: res.statusCode,
    message: 'Upload OK'
  });

});

Can anyone help me implement this code to directly send the stream to Google Cloud Storage?

  • It should work. ***I would like the file to be saved DIRECTLY on GCloud, without saving it, not even temporarily, in the backend.*** - the whole purpose of Streaming transfer is to stream data to and from your Cloud Storage account without requiring that the data first be saved to a file as mentioned in [this document](https://cloud.google.com/storage/docs/streaming). – Prabir Nov 22 '21 at 09:59
  • Have you tried the code? Are you getting any errors? – Prabir Nov 22 '21 at 10:03
  • @Prabir my code works perfectly, as I indicated in the question. The thing is, I have no idea how to test and verify that the file is transferred directly without first being read in its entirety: I have read several times that multer reads and loads the entire file, then transfers it to the stream, but I just don't know how I can verify this. Any ideas? – crivella Nov 23 '21 at 08:16
  • I have posted an answer. Can you check if it is helpful? – Prabir Dec 03 '21 at 03:37

1 Answer

The purpose of Streaming transfer is to stream data to and from your Cloud Storage account without requiring that the data first be saved to a file as mentioned in this document.
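
For example, applied to the handler in your question, you could wrap the multer buffer in a PassThrough and pipe it into the write stream, responding only once the upload has actually finished. This is a minimal sketch reusing the router, multerStorage, project/key values and bucket name from your code:

import { PassThrough } from 'stream';
import { Storage } from '@google-cloud/storage';

// Client and bucket created once, outside the handler (same values as in your code).
const storage = new Storage({ projectId: 'xxx', keyFilename: 'yyy' });
const bucket = storage.bucket('bucketName');

router.post('/upload', multerStorage.single('uploadedFile'), (req, res) => {
  const newFile = bucket.file(req.file.originalname);

  // Wrap the in-memory buffer in a PassThrough and pipe it into the
  // bucket file's write stream.
  const bufferStream = new PassThrough();
  bufferStream.end(req.file.buffer);

  bufferStream
    .pipe(newFile.createWriteStream({
      metadata: { contentType: req.file.mimetype },
      resumable: false
    }))
    .on('error', err => {
      console.error('upload error', err);
      res.status(500).json({ message: 'Upload failed' });
    })
    .on('finish', () => {
      res.status(200).json({ status: res.statusCode, message: 'Upload OK' });
    });
});

Note that this does not change where the file lives before the upload: multer's memoryStorage has already buffered the whole file in req.file.buffer by the time the handler runs, so the piping only changes how the data is handed to Cloud Storage.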

The createWriteStream method, which you are asking about, is documented here. You can control the stream using the options available here.
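
For instance, a few of the options you can pass (a sketch; see the linked reference for the full list and defaults):

const fileStream = newFile.createWriteStream({
  metadata: { contentType: req.file.mimetype }, // object metadata on the stored file
  gzip: true,           // gzip the content on upload
  resumable: false,     // simple upload instead of a resumable session
  validation: 'crc32c'  // checksum validation of the written data
});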

For resumable uploads, files can get stored locally if the directory is writable, as stated in this document.

Resumable uploads are automatically enabled and must be shut off explicitly by setting options.resumable to false.

Resumable uploads require write access to the $HOME directory. Through config-store, some metadata is stored. By default, if the directory is not writable, we will fall back to a simple upload. However, if you explicitly request a resumable upload, and we cannot write to the config directory, we will return a ResumableUploadError.

In your case, since you have set options.resumable to false, it should be a simple upload and the file should not be stored locally.

To check whether anything is being stored locally, you can run the code in an environment that prevents writing to disk, or monitor disk and memory usage during the upload operation, for example as in the sketch below.
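
A rough way to watch the process from the inside while an upload runs is to log memory usage with Node's built-in process.memoryUsage() (a sketch; disk usage is easier to watch with an OS tool such as df):

// Log RSS and heap usage once a second while an upload is in flight,
// to get a rough idea of how much data the process is holding.
const timer = setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`rss=${(rss / 1048576).toFixed(1)} MB, heapUsed=${(heapUsed / 1048576).toFixed(1)} MB`);
}, 1000);

// ... start the upload, then stop the timer when it finishes ...
// clearInterval(timer);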

– Prabir