
Per the Firebase Cloud Functions documentation, you can leverage ImageMagick from within a cloud function: https://firebase.google.com/docs/functions/use-cases

Is it possible to do something similar but call out to FFMPEG instead of ImageMagick? While thumbnailing images is great, I'd also like the capability to append incoming images to a video file stored out on Firebase Storage.

Frank van Puffelen
Dave Parks
  • Bear in mind that you have a limited amount of temp disk space and memory to work with. In fact, temp disk *is* stored in memory, so if you have a large video, you could easily run out of memory. – Doug Stevenson Mar 14 '17 at 00:16
  • Note, though, that quite a lot of memory is available for many operations: as much as `8.5 GB per function`, as written [here](https://firebase.google.com/docs/functions/quotas) – Paul Jan 05 '22 at 10:35

7 Answers


Update: ffmpeg is now preinstalled in the Cloud Functions environment. For a complete list of preinstalled packages, check out https://cloud.google.com/functions/docs/reference/system-packages.

Note: As of April 2023, Google does not offer ffmpeg as a preinstalled package for Cloud Functions running the latest version of Ubuntu (22.04). Make sure you pick a runtime environment that uses Ubuntu 18.04 in order to have ffmpeg preinstalled on your cloud function. You can find a complete list of runtime environments that use Ubuntu 18.04 available here.

Note: you only have disk write access at /tmp/.

Option 1: use the fluent-ffmpeg npm module

This module wraps the ffmpeg command-line options in an easy-to-use Node.js API.

const ffmpeg = require('fluent-ffmpeg');

// Note: assumes this runs inside an HTTP function with an Express-style `res`.
let cmd = ffmpeg('example.mp4')
    .clone()
    .size('300x300')
    .save('/tmp/smaller-file.mp4')
    .on('end', () => {
      // Finished processing the video.
      console.log('Done');

      // E.g. return the resized video:
      res.sendFile('/tmp/smaller-file.mp4');
    });

Full code on GitHub

Option 2: invoke the ffmpeg binary directly

Because ffmpeg is already installed, you can invoke the binary and its command line options via a shell process.

const { exec } = require("child_process");

exec("ffmpeg -i example.mp4", (error, stdout, stderr) => {
  // Note: ffmpeg writes its log output to stderr, not stdout.
  console.log(stderr);
});

Full code on GitHub

Option 3: upload your own binary

If you need a specific version of ffmpeg, you can include an ffmpeg binary as part of the upload and then run a shell command using something like child_process.exec. You'll need an ffmpeg binary that's compiled for the target platform (Ubuntu).

File listing with pre-compiled ffmpeg binary

./
../
index.js
ffmpeg

index.js

const { exec } = require("child_process");

exec("ffmpeg -i example.mp4", (error, stdout, stderr) => {
  // Note: ffmpeg writes its log output to stderr, not stdout.
  console.log(stderr);
});

I've included two full working examples on GitHub. The examples are for Google Cloud Functions (not specifically Cloud Functions for Firebase).

Llama D'Attore
Bret McGowen
  • How do I refer to a file in cloud firebase function, I mean what is the path to write in the FFMPEG command to refer to a file in firebase storage? – Omar HossamEldin Dec 04 '17 at 22:24
    @OmarHossamEldin everything you upload as part of your function is stored in the `/user_code/` directory on the server. – Bret McGowen Apr 06 '18 at 23:31
  • This was a life saver - thank you!! It was such a help having the refs the Github pages. Just a side note though, I am using `fluent-ffmpeg` npm package, and needed to add the ffmpeg path to the binary in the Docker file as an ENV variable: `ENV PATH="/usr/src/app/node_modules/ffmpeg-static/bin/linux/x64:${PATH}"` – Peza Jun 05 '18 at 11:02
    `ffmpeg` is now included in the Cloud Functions environment – wabisabit Jul 28 '20 at 06:48

While you technically can run FFMPEG on a Firebase Functions instance, you will quickly hit the small quota limits.

As per this answer, you can instead use Functions to trigger a request to GCP's more powerful App Engine or Compute Engine services. The App Engine process can grab the file from the same bucket, handle the transcoding, and upload the finished file back to the bucket. If you check the other answers at the link, one user posted a sample repo that does just that.
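A rough sketch of that hand-off (the endpoint URL and payload field names are hypothetical, not from the linked answer): the function just packages the storage event into a small request body and POSTs it to the App Engine service, which does the actual transcoding:

```javascript
// Hypothetical helper: turn a Storage object event into a transcode request
// body for an App Engine endpoint. The field names are made up for this sketch.
function buildTranscodeRequest(object) {
  return {
    bucket: object.bucket,
    name: object.name,
    // Derive an output name by stripping the extension and appending a suffix.
    outputName: object.name.replace(/\.[^/.]+$/, '') + '_transcoded.mp4',
  };
}

const body = buildTranscodeRequest({ bucket: 'my-bucket', name: 'videos/input.mov' });
console.log(JSON.stringify(body));

// The function would then POST `body` to the App Engine service, e.g. with
// fetch('https://<project>.appspot.com/transcode', { method: 'POST', ... }).
```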

ajabeckett
  • Looks like the [current limit](https://firebase.google.com/docs/functions/quotas#time_limits) for "event-driven" functions (such as Firebase storage triggers and Firestore triggers) is 10 minutes. You can do a lot with `ffmpeg` in 10 minutes! – Charles Holbrow Nov 21 '22 at 01:15

ffmpeg is now included in the Cloud Functions environment so it can be used directly:

const { spawn } = require('child_process');

spawn(
  'ffmpeg',
  ['-i', 'video.mp4']
)

Full list of installed packages: https://cloud.google.com/functions/docs/reference/nodejs-system-packages

wabisabit
  • the version on the runtime environment was too old, it didn't even support the var_stream_map flag - I moved the heavy lifting to App Engine – danday74 Oct 23 '21 at 02:21

Use the lib https://github.com/eugeneware/ffmpeg-static

const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');


let cmd = ffmpeg('filePath.mp4')
   .setFfmpegPath(ffmpeg_static.path)
   .setInputFormat('mp4')
   .output('outputPath.mp4')
   // ...additional options elided in the original answer...
   .run()

Daniel Lessa
    see: https://github.com/firebase/functions-samples/tree/master/ffmpeg-convert-audio – Henry Jan 16 '18 at 08:09
  • Note that the current version of ffmpeg-static returns the path directly, so you need to call `.setFfmpegPath(ffmpeg_static)` directly, without the `.path` – Vojtěch Jun 16 '20 at 21:22
  • This answer lacks detail. ffmpeg-static is great. You just add it as a dependency and it auto installs a suitable version of ffmpeg when someone (or the system) does an npm install. You can get the path to the ffmpeg bin with ... const pathToFfmpeg = require('ffmpeg-static') ... and then use that path when executing commands with, for example, Node's execSync function - however, I'm not sure how well it will work on a Cloud Runtime environment and the quotas issue is likely to still be a problem - best stick to using AppEngine - but +1 for a good idea – danday74 Oct 23 '21 at 02:18
/**
 * Copyright 2017 Google Inc. All Rights Reserved.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
'use strict';

const functions = require('firebase-functions');
const gcs = require('@google-cloud/storage')();
const path = require('path');
const os = require('os');
const fs = require('fs');
const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');

/**
 * When an audio file is uploaded to the Storage bucket, we automatically generate
 * a mono-channel version using node-fluent-ffmpeg.
 */
exports.generateMonoAudio = functions.storage.object().onChange(event => {
  const object = event.data; // The Storage object.

  const fileBucket = object.bucket; // The Storage bucket that contains the file.
  const filePath = object.name; // File path in the bucket.
  const contentType = object.contentType; // File content type.
  const resourceState = object.resourceState; // The resourceState is 'exists' or 'not_exists' (for file/folder deletions).
  const metageneration = object.metageneration; // Number of times metadata has been generated. New objects have a value of 1.

  // Exit if this is triggered on a file that is not an audio.
  if (!contentType.startsWith('audio/')) {
    console.log('This is not an audio.');
    return;
  }

  // Get the file name.
  const fileName = path.basename(filePath);
  // Exit if the audio is already converted.
  if (fileName.endsWith('_output.flac')) {
    console.log('Already a converted audio.');
    return;
  }

  // Exit if this is a move or deletion event.
  if (resourceState === 'not_exists') {
    console.log('This is a deletion event.');
    return;
  }

  // Exit if file exists but is not new and is only being triggered
  // because of a metadata change.
  if (resourceState === 'exists' && metageneration > 1) {
    console.log('This is a metadata change event.');
    return;
  }

  // Download file from bucket.
  const bucket = gcs.bucket(fileBucket);
  const tempFilePath = path.join(os.tmpdir(), fileName);
  // We add a '_output.flac' suffix to target audio file name. That's where we'll upload the converted audio.
  const targetTempFileName = fileName.replace(/\.[^/.]+$/, "") + '_output.flac';
  const targetTempFilePath = path.join(os.tmpdir(), targetTempFileName);
  const targetStorageFilePath = path.join(path.dirname(filePath), targetTempFileName);

  return bucket.file(filePath).download({
    destination: tempFilePath
  }).then(() => {
    console.log('Audio downloaded locally to', tempFilePath);
    // Convert the audio to mono channel using FFMPEG.
    const command = ffmpeg(tempFilePath)
      .setFfmpegPath(ffmpeg_static.path)    
      .audioChannels(1)
      .audioFrequency(16000)
      .format('flac')
      .on('error', (err) => {
        console.log('An error occurred: ' + err.message);
      })
      .on('end', () => {
        console.log('Output audio created at', targetTempFilePath);

        // Uploading the audio.
        return bucket.upload(targetTempFilePath, {destination: targetStorageFilePath}).then(() => {
          console.log('Output audio uploaded to', targetStorageFilePath);

          // Once the audio has been uploaded delete the local file to free up disk space.     
          fs.unlinkSync(tempFilePath);
          fs.unlinkSync(targetTempFilePath);

          console.log('Temporary files removed.', targetTempFilePath);
        });
      })
      .save(targetTempFilePath);
  });
});

https://github.com/firebase/functions-samples/blob/master/ffmpeg-convert-audio/functions/index.js

Henry
  • Note that the current version of ffmpeg-static returns the path directly, so you need to call `.setFfmpegPath(ffmpeg_static)` directly, without the `.path` – Vojtěch Jun 16 '20 at 21:22

Practically, no. FFMPEG processes audio/video files, which typically exceed the Cloud Functions quotas (for example, the 10 MB limit on HTTP request payloads).

You would need to run Node.js on GCP's App Engine instead.

Ronnie Royston
  • Thanks, just wondering if you've seen any repos with code templates that you could build upon for such tasks? Specifically how you would trigger the workflow in GAE? – Igniter Oct 16 '19 at 03:01
    You use functions to trigger the workflow; they make requests to GAE to do the heavy lifting. Similarly, when the heavy lifting is done, GAE can make requests back to HTTP functions, or use PubSub for comms, since functions can also handle PubSub – danday74 Oct 23 '21 at 02:24

The other answers that suggest App Engine are correct, but they leave out a key piece: what is App Engine?

It's basically the heavy lifter. It allows you to write a backend and deploy it to the cloud. Think of a Node express server you might typically develop. Then deploy it to the cloud. That's App Engine.

Firebase / Cloud Functions talk to App Engine typically over HTTP or via PubSub.

Functions are meant for lightweight work. They tell you when an event has happened (e.g. file uploaded to storage bucket) and the "event" that was fired has a payload detailing info about the event (e.g. details of the object uploaded to the bucket).

When that event happens, if heavy work is needed (or if required software on the Node.js runtime environment is lacking), the function makes, for example, an HTTP request to App Engine, providing the info App Engine needs to do the necessary processing.

App Engine is flexible. You define a yaml file and optionally a Dockerfile.

Here's an example:

runtime: custom # custom means it uses a Dockerfile
env: flex

manual_scaling:
  instances: 1
resources:
  cpu: 1
  memory_gb: 0.5
  disk_size_gb: 10

Here you define CPU count, memory, disk size, etc. Unlike functions, the disk is writable (I am led to believe, I am still in the process of integration).

Via the Dockerfile you can define exactly what software you want installed. If you are not familiar with Dockerfiles, here's a nice example.

https://nodejs.org/en/docs/guides/nodejs-docker-webapp
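For example, a minimal Dockerfile along these lines (a sketch; the base image, app layout, and entry point are assumptions, not from this answer) would give the App Engine flexible service its own ffmpeg:

```dockerfile
# Sketch: Node.js base image with ffmpeg installed from the distro packages.
FROM node:18-slim

RUN apt-get update && apt-get install -y ffmpeg && rm -rf /var/lib/apt/lists/*

WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .

# App Engine flex routes traffic to port 8080 by default.
EXPOSE 8080
CMD ["node", "server.js"]
```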

You develop locally and then when done, you deploy to the cloud with:

gcloud app deploy

And voila, your app appears in the cloud. The gcloud command comes with the Google Cloud SDK.

Note that App Engine can talk back to functions via HTTP functions or PubSub when processing is complete.

Much love in Him :D

danday74