3

I've tried the code below, but Google Cloud Functions limits a response to 10MB and I want to return larger files:

// json2csv's parse(), aliased to match the call below
const json2csv = require("json2csv").parse;

const csv = json2csv(onlyDataTransactions);
response.setHeader(
  "Content-disposition",
  "attachment; filename=transactions.csv"
);
response.set("Content-Type", "text/csv");
response.status(200).send(csv);

Update: Thanks to @Andrew I have this first revision of the code. I force the compression because the compression middleware on Firebase Cloud Functions depends even on the user-agent header. I'm still working on the other suggestions to find the best result, thanks to all.

// node-gzip is used to force compression regardless of the request headers
const { gzip } = require("node-gzip");
const { parse } = require("json2csv");

if (request.headers['content-type'] === 'text/csv') {
    const onlyDataTransactions = transactions.map(transaction => transaction.toCsvRecord());
    const csv = parse(onlyDataTransactions);
    response.setHeader(
       "Content-disposition",
       "attachment; filename=transactions.csv"
    );
    response.set("Content-Type", "text/csv");
    response.set('Content-Encoding', 'gzip');
    // gzip the CSV string directly; JSON.stringify would wrap it in quotes and escape the newlines
    const content = await gzip(csv);
    response.status(200).send(content);
}
  • Does this answer your question? [Cloud Function - Getting file contents more than 10 MB](https://stackoverflow.com/questions/56090400/cloud-function-getting-file-contents-more-than-10-mb) – Nakilon Nov 27 '21 at 01:09

3 Answers

2

How large is the .csv that you are transferring?

Documentation:

Note: Cloud Functions limits HTTP request body sizes to 10MB, so any request larger than this will be rejected before your function is executed. We recommend uploading large files or files that persist beyond a single request directly to Cloud Storage

Based on this, I would compress the .csv and add a Content-Encoding header so the file is converted to and from a gzip-compressed state. You should get a good compression ratio on a .csv file, for example:

Content-Type: text/plain
Content-Encoding: gzip
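
As a rough sketch of doing this inside the function (assuming the Node runtime and the built-in zlib module; buildCsv() is just a placeholder for however you generate the CSV):

const zlib = require("zlib");

exports.downloadTransactions = (request, response) => {
  const csv = buildCsv();                    // placeholder for your CSV generation
  response.setHeader(
    "Content-Disposition",
    "attachment; filename=transactions.csv"
  );
  response.set("Content-Type", "text/csv");
  response.set("Content-Encoding", "gzip");  // tell the client the body is gzipped
  const compressed = zlib.gzipSync(csv);     // compress the body before sending
  response.status(200).send(compressed);
};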

If you are using Google Cloud Storage (GCS), you can read more about transcoding in the GCS documentation.

gzip is a form of data compression: it typically reduces the size of a file. This allows the file to be transferred faster and stored using less space than if it were not compressed. Compressing a file can reduce both cost and transfer time. Transcoding, in Cloud Storage, is the automatic changing of a file's compression before it's served to a requester. When transcoding results in a file becoming gzip-compressed, it can be considered compressive, whereas when the result is a file that is no longer gzip-compressed, it can be considered decompressive. Cloud Storage supports the decompressive form of transcoding.
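
For illustration, a minimal sketch (bucket and object names are made up) of writing the CSV to GCS gzip-compressed so that decompressive transcoding can serve it:

const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

async function saveCompressedCsv(csv) {
  // gzip: true compresses the payload and sets Content-Encoding: gzip on the object,
  // so GCS can transcode (decompress) it when a client requests the file uncompressed
  await storage
    .bucket("my-transactions-bucket")        // assumed bucket name
    .file("transactions.csv")
    .save(csv, { gzip: true, metadata: { contentType: "text/csv" } });
}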

There is also an excellent resource on the Mozilla site, worth reading, about setting these headers to compress and decompress media.

If this does not work for you, then your only real option is to either chunk the data somehow or compress the file while it is still in the GCS bucket. I have had success with gzip compression levels in the past to reduce the file size. The compression could be done with another Cloud Function if needed.
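
A rough sketch of how that other function might do it (bucket and object names are assumptions; this uses the Node Storage client and the built-in zlib module):

const { Storage } = require("@google-cloud/storage");
const { pipeline } = require("stream/promises");
const zlib = require("zlib");

const storage = new Storage();

async function compressInBucket(bucketName, objectName) {
  const bucket = storage.bucket(bucketName);
  // stream the original object through gzip and write it back as a .gz copy
  await pipeline(
    bucket.file(objectName).createReadStream(),
    zlib.createGzip({ level: 9 }),           // highest compression level
    bucket.file(`${objectName}.gz`).createWriteStream({
      metadata: { contentType: "text/csv", contentEncoding: "gzip" }
    })
  );
}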

Andrew
  • Thank you, I didn't think about compression, that's a good idea, I will do that. I guess the file size will be at least 30MB because it is a transaction history file, and I would like to let the user download several months of transactions – Jesus David Mondragón Cirio Jan 21 '21 at 17:12
  • I have updated my answer, as I think the other option is to compress the data while it is still in storage, before it is downloaded. – Andrew Jan 22 '21 at 10:05
  • Hi Andrew, I updated the question. I use node-gzip to compress, because the compression middleware in the Firebase Cloud Functions environment depends on request data, even the user-agent (https://stackoverflow.com/questions/48309760/how-to-send-gzipped-json-response-from-google-cloud-functions), and I want to force compression – Jesus David Mondragón Cirio Jan 25 '21 at 18:35
1

Google Cloud Functions doesn't support streaming responses. I recommend you have a look at Cloud Run (it's very easy to wrap a function in Cloud Run) and use Cloud Run's streaming capability instead.
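
As a minimal sketch of that idea (the fetchTransactions() async iterator is hypothetical; toCsvRecord() is taken from the question), an Express app on Cloud Run can write the rows to the response as it goes:

const express = require("express");
const app = express();

app.get("/transactions.csv", async (req, res) => {
  res.setHeader("Content-Disposition", "attachment; filename=transactions.csv");
  res.set("Content-Type", "text/csv");
  res.write("id,amount,date\n");                          // assumed column layout
  for await (const transaction of fetchTransactions()) {  // hypothetical data source
    res.write(transaction.toCsvRecord() + "\n");
  }
  res.end();                                              // streamed, not buffered as one 10MB-capped body
});

app.listen(process.env.PORT || 8080);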

guillaume blaquiere
0

Try an approach with a stream and pipe(). I hope it will solve the issue.

Look at this repo; it should help:

https://github.com/dipbd1/ts-unicode-stream-throttle
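
For instance, a minimal sketch of the pipe() idea (bucket and object names are assumptions), streaming a file from Cloud Storage into the response instead of buffering it in memory:

const { Storage } = require("@google-cloud/storage");
const storage = new Storage();

exports.streamCsv = (request, response) => {
  response.setHeader("Content-Disposition", "attachment; filename=transactions.csv");
  response.set("Content-Type", "text/csv");
  storage
    .bucket("my-transactions-bucket")        // assumed bucket name
    .file("transactions.csv")
    .createReadStream()
    .on("error", (err) => response.status(500).end(err.message))
    .pipe(response);
};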