
What's the proper way to return a stream from a Firebase callable function? I want it to behave like a proxy and return the response stream directly. Currently, I wait for the response and append the chunks as shown below. I don't think this is the right way to do it, as it doesn't return the stream directly.

const axios = require('axios');
const functions = require("firebase-functions");
const {PassThrough} = require("stream");

exports.create = functions.https.onCall(async (data, context) => {

    const options = {
        method: 'POST',
        url: '...someurl',
        responseType: 'stream'
    };

    const response = await axios(options);

    // pipe the upstream response through a PassThrough that base64-encodes the chunks
    const chunks = response.data.pipe(new PassThrough({encoding: 'base64'}));

    // then use async iteration to read the chunks and accumulate them into a string
    let str = '';
    for await (const chunk of chunks) {
        str += chunk;
    }
    return str;
});
Frank van Puffelen
Ozgur Sahin
  • Cloud Functions instances are terminated once the promise you return resolves. Is it required for you to stream data? It might be better to use [Google Compute Engine](https://cloud.google.com/compute) instead of Cloud Functions in that case. – Dharmaraj Sep 06 '22 at 11:56
  • Cloud Functions seem easier since they offer built-in controls like App Check, but I will look into that too. Thank you! – Ozgur Sahin Sep 06 '22 at 14:44
  • I achieved this through Firebase v2 functions; however, it seems this is not possible with an onCall function. In a Firebase HTTP function we have the response as a parameter, which works absolutely fine, but it is hard to check its authentication. – Shajeel Afzal Jan 31 '23 at 14:30
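
For reference, a minimal sketch of the approach described in the last comment, assuming an HTTP (onRequest) function, a hypothetical `proxy` export name, and that the client sends its Firebase ID token in an `Authorization: Bearer` header; it verifies the token manually and pipes the upstream response straight to the client:

const axios = require('axios');
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.proxy = functions.https.onRequest(async (req, res) => {
    // onRequest has no context.auth, so verify the caller's ID token manually
    const idToken = (req.headers.authorization || '').replace('Bearer ', '');
    try {
        await admin.auth().verifyIdToken(idToken);
    } catch (err) {
        res.status(401).send('Unauthorized');
        return;
    }

    // fetch the upstream resource as a stream and pipe it directly to the HTTP response
    const upstream = await axios({
        method: 'POST',
        url: '...someurl', // same placeholder as in the question
        responseType: 'stream'
    });
    upstream.data.pipe(res);
});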

1 Answer


If you have access to Cloud Storage buckets, you can upload the data to a bucket, generate a download URL, and redirect your HTTP request to the download link:

// using the Admin SDK inside an HTTP (Express-style) function
import * as admin from "firebase-admin";

admin.initializeApp();
const bucket = admin.storage().bucket('<bucket name>');

// upload the data to the bucket
await bucket.file('<filename>').save(data).catch(console.error);

// create a short-lived signed URL and redirect the client to it
bucket.file('<filename>').getSignedUrl({
   action: 'read',
   expires: Date.now() + 600000, // expires in 10 minutes
}).then(url => {
   response.redirect(url[0]); // redirect to download URL
}).catch(error => {
   console.log(error);
});

You will have to change your Storage security rules so that the download URL can be created and accessed. You can do this in the Rules tab of the Storage section in the Firebase console:

service firebase.storage {
  match /b/{bucket}/o {
    match /<bucket name>/{allPaths=**} {
      allow read: if true;
      allow write: if request.auth != null; //all other locations are secured by auth
    }
    match /{allPaths=**} {
      allow read, write: if request.auth != null; 
    }
  }
}

Ensure you replace <bucket name> with the name of your bucket. This rule allows anyone to read from the data bucket but not write to it unless authenticated. All other locations remain secured. If you are storing confidential information, do not use this permissions workaround; secure the download URL creation with proper auth instead.

Joe Moore
  • Thank you, I think this could solve my problem too. I don't want to increase the costs, but I guess I can store files in the bucket temporarily and hopefully have them auto-removed after some duration, like a minute, with some extra settings. – Ozgur Sahin Sep 06 '22 at 15:30
  • No worries. Feel free to ask if you get stuck. You may benefit from having the Admin SDK installed: `npm i firebase-admin`. You are able to delete the file after downloading it, thus making the download link redundant and having the effect of only storing one file in the bucket at a time. That would keep costs as low as feasibly possible with this solution. – Joe Moore Sep 06 '22 at 15:40
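
For what it's worth, a minimal sketch of the cleanup described in the comment above, assuming the same `<filename>` placeholder and `bucket` reference as in the answer; the delete should only run once you know the client has finished downloading, otherwise the signed URL will point at a missing object:

// remove the temporary object so only one file ever lives in the bucket;
// run this only after the download via the signed URL has completed
await bucket.file('<filename>').delete().catch(error => {
    console.log(error);
});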