
I'm trying to invoke a Google Cloud Function, sending it images larger than 50 MB. The purpose of the cloud function is to resize the images and upload them to Google Cloud Storage.

However, when I send the HTTP POST request to my cloud function I get the following error: 413 Request Entity Too Large

Does anyone have a workaround for this error? Can I increase the HTTP request size limit?

Doug Stevenson
João Rulff

3 Answers


The limit for HTTP trigger upload and download payload size is documented at 10MB. There is no way to get this limit increased, but you can always file a feature request explaining why it should be increased.
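Since the limit can't be raised, a common workaround is to have the function hand the client a short-lived signed URL and let the client PUT the oversized image straight into Cloud Storage. A rough sketch, not from the answer itself (the bucket name, object path, and query parameters are all placeholders):

```javascript
const functions = require('firebase-functions');
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

// Returns a V4 signed URL the client can PUT the raw image bytes to,
// bypassing the HTTP trigger payload limit entirely.
exports.getUploadUrl = functions.https.onRequest(async (req, res) => {
  const [url] = await storage
    .bucket('my-uploads-bucket')            // placeholder bucket
    .file(`incoming/${req.query.name}`)     // placeholder object path
    .getSignedUrl({
      version: 'v4',
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
      contentType: req.query.contentType,
    });
  res.json({url});
});
```

The client then uploads with a plain `PUT` to the returned URL, and a storage-triggered function can pick the file up from there to resize it.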

Doug Stevenson
  • Hey Doug, what about compressing the data? Are there any case studies on this, or examples of compressing data for HTTP responses from CF or HTTP requests from clients? I was considering going the compression route, but decided to just put the data inside of GCS and then access it through Firebase from the client (fit my use case pretty well). – insta catering Oct 14 '20 at 16:44

You can let the client upload directly to Cloud Storage, authenticated into their own user folder, with security rules limiting the file size to whatever you wish, into a temp folder.

Then have a Cloud Function trigger start resizing the image, and delete the original image when finished.

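The per-user folder and size cap described above can be enforced with Firebase Storage security rules; a minimal sketch (the `temp` path and the 50 MB cap are assumptions, adjust to taste):

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    // Each signed-in user may only write into their own temp folder,
    // and only files below the size cap.
    match /temp/{uid}/{fileName} {
      allow write: if request.auth != null
                   && request.auth.uid == uid
                   && request.resource.size < 50 * 1024 * 1024; // 50 MB
    }
  }
}
```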
I'm attaching a code example of mine - you should add a delete of the file after conversion...

// Dependencies the snippet needs (the promise-returning mkdirp and
// spawn calls below match the mkdirp-promise and child-process-promise
// packages).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const gcs = require('@google-cloud/storage')();
const path = require('path');
const os = require('os');
const fs = require('fs');
const mkdirp = require('mkdirp-promise');
const spawn = require('child-process-promise').spawn;
admin.initializeApp();

// Thumbnail dimensions and filename prefix.
const THUMB_MAX_HEIGHT = 200;
const THUMB_MAX_WIDTH = 200;
const THUMB_PREFIX = 'thumb_';

// `deleteImage` and `getUidFromFilePath` are the author's own helpers,
// not shown in this snippet.

/**
 * When an image is uploaded in the Storage bucket We generate a thumbnail automatically using
 * ImageMagick.
 * After the thumbnail has been generated and uploaded to Cloud Storage,
 * we write the public URL to the Firebase Realtime Database.
 */
exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  console.log('Generation started');

  // File and directory paths.
  const filePath = object.name;
  const contentType = object.contentType; // This is the image MIME type
  const fileDir = path.dirname(filePath);
  const fileName = path.basename(filePath);
  const thumbFilePath = path.normalize(path.join(fileDir, `${THUMB_PREFIX}${fileName}`));
  const tempLocalFile = path.join(os.tmpdir(), filePath);
  const tempLocalDir = path.dirname(tempLocalFile);
  const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);

  // Exit if this is triggered on a file that is not an image.
  if (!contentType.startsWith('image/')) {
    console.log('This is not an image.');
    deleteImage(fileName);
    return null;
  }

  // Exit if the image is already a thumbnail.
  if (fileName.startsWith(THUMB_PREFIX)) {
    console.log('Already a Thumbnail.');
    deleteImage(fileName);
    return null;
  }

  // Cloud Storage files.
  const bucket = gcs.bucket(object.bucket);
  const file = bucket.file(filePath);
  const thumbFile = bucket.file(thumbFilePath);
  const metadata = {
    contentType: contentType,
    // The Cache-Control header enables client-side caching.
    'Cache-Control': 'public,max-age=3600',
  };
  // Create the temp directory where the storage file will be downloaded.
  return mkdirp(tempLocalDir).then(() => {
    console.log('DL Started');

    // Download file from bucket.
    return file.download({
      destination: tempLocalFile
    });
  }).then(() => {
    console.log('The file has been downloaded to', tempLocalFile);
    // Generate a thumbnail using ImageMagick.
    return spawn('convert', [tempLocalFile, '-thumbnail', `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`, tempLocalThumbFile], {
      capture: ['stdout', 'stderr']
    });
  }).then(() => {
    console.log('Thumbnail created at', tempLocalThumbFile);
    // Uploading the Thumbnail.
    return bucket.upload(tempLocalThumbFile, {
      destination: thumbFilePath,
      metadata: metadata
    });
  }).then(() => {
    console.log('Thumbnail uploaded to Storage at', thumbFilePath);
    // Once the image has been uploaded delete the local files to free up disk space.
    fs.unlinkSync(tempLocalFile);
    fs.unlinkSync(tempLocalThumbFile);
    // Get the Signed URLs for the thumbnail and original image.
    const config = {
      action: 'read',
      expires: '03-01-2500',
    };
    return Promise.all([
      thumbFile.getSignedUrl(config),
      // file.getSignedUrl(config),
    ]);
  }).then((results) => {
    console.log('Got Signed URLs.');
    const thumbResult = results[0];
    // const originalResult = results[1];
    const thumbFileUrl = thumbResult[0];
    // const fileUrl = originalResult[0];
    // Add the URLs to the Database
    const uid = getUidFromFilePath(fileDir);
    if (!uid) return null;

    return Promise.all([
      admin.auth().updateUser(uid, {
        photoURL: thumbFileUrl
      }),
      admin.database().ref(`/users/${uid}/profile/photoURL`).set(thumbFileUrl)
    ]);
  }).then(() => console.log('Thumbnail URLs saved to database.'));
});
jBOB

As of 2022, the payload limit for second-generation Cloud Functions is 32 MB.

jackstruck