
In my application I read a huge amount of image data and send all of it to the client:

const imagesPaths = await getFolderImagesRecursive(req.body.rootPath);
const dataToReturn = await Promise.all(imagesPaths.map(async (imagePath) => {
    // fs here must be the promise-based API: const fs = require('fs').promises;
    const imageB64 = await fs.readFile(imagePath, 'base64');

    return {
        filename: imagePath,
        imageData: imageB64,
    };
}));

return res.status(200).send({
    success: true,
    message: 'Successfully retrieved folder images data',
    data: dataToReturn,
});

Here is the client side:

const getFolderImages = (rootPath) => {
    return fetch('api/getFolderImages', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ rootPath }),
    });
};

const getFolderImagesServerResponse = await getFolderImages(rootPath);
const getFolderImagesServerData = await getFolderImagesServerResponse.json();

When I send the data, the request fails because the payload is too large. Sending everything at once with res.send(<data>) is not possible. So how can I work around this limitation, and how should the client receive the data under the new approach?

T THE R
  • Does this answer your question? [NodeJS/ExpressJS send response of large amount of data in 1 stream](https://stackoverflow.com/questions/17622265/nodejs-expressjs-send-response-of-large-amount-of-data-in-1-stream) –  Feb 10 '21 at 12:38
  • @AlphaMirage Only partly, for the backend side; it does not answer how I can accept the data on the client side. – T THE R Feb 10 '21 at 12:41
  • Maybe try looking at this question. It doesn't have an answer but it includes something that you are missing. Try setting some headers: [NodeJs: How to send very large data from server to client](https://stackoverflow.com/questions/45779883/nodejs-how-to-send-very-large-data-from-server-to-client) –  Feb 10 '21 at 12:44

1 Answer


The answer to your problem requires some reading:

Link to the solution

One thing you may not have taken full advantage of before is that a web server's HTTP response is a stream by default.

Frameworks just make it easier for you to pass in synchronous data, which is split into chunks under the hood and sent as HTTP packets.

We are talking about huge files here; naturally, we don't want to hold them in memory, or at least not the whole blob at once. The perfect tool for this dilemma is a stream.

We create a read stream with the help of the built-in Node package 'fs', then pass it to the stream-compatible response.send parameter.

const fs = require('fs');

const readStream = fs.createReadStream('example.png');
return response.headers({
  'Content-Type': 'image/png',
  'Content-Disposition': 'attachment; filename="example.png"',
}).send(readStream);

I used the Fastify web server here, but it should work similarly with Koa or Express.

There are two more configurations here: the 'Content-Type' and 'Content-Disposition' headers.

The first one indicates the type of blob we are sending chunk by chunk, so the browser can assign the right file extension to it.

The latter tells the browser that we are sending an attachment, not something renderable like an HTML page or a script. This triggers the browser's download functionality, which is widely supported. The filename parameter is the download name of the content.

Here we are; we accomplished minimal memory stress, minimal coding, and minimal error opportunities.

One thing we haven’t mentioned yet is authentication.

Because the frontend won't be sending an Ajax request, we can't expect an auth JWT header to be present on the request.

Here we will take the good old cookie auth approach. Cookies are set automatically on every request header that matches the criteria, based on the cookie options. More info about this in the frontend implementation part.

By default, cookies arrive as semicolon-separated key-value pairs in a single string. To simplify the parsing, we will use Fastify's cookie-parser plugin.
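If you'd rather not pull in the plugin, splitting that string by hand takes only a few lines (a sketch; parseCookies is a hypothetical helper name of my own):

```javascript
// Minimal hand-rolled Cookie header parser: turns a raw string like
// "auth=abc; theme=dark" into { auth: 'abc', theme: 'dark' }.
const parseCookies = (header = '') =>
  Object.fromEntries(
    header
      .split(';')
      .map((pair) => pair.trim())
      .filter((pair) => pair.includes('='))
      .map((pair) => {
        const eq = pair.indexOf('=');
        return [pair.slice(0, eq), decodeURIComponent(pair.slice(eq + 1))];
      })
  );
```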

await fastifyServer.register(cookieParser);

Later, in the handler method, we simply get the cookie we are interested in and compare it to the expected value. Here I used plain strings as auth tokens; this should be replaced with some sort of hashing and comparison algorithm.

const cookies = request.cookies;
if (cookies['auth'] !== 'authenticated') {
   throw new APIError(400, 'Unauthorized');
}

That’s it. We have authentication on top of the file streaming endpoint, and everything is ready to be connected by the frontend.
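On the client side, if you trigger the download by navigation, the browser handles everything for you thanks to Content-Disposition. If you instead want to consume the response in JavaScript, fetch exposes response.body as a ReadableStream, so you can read the download chunk by chunk rather than buffering one giant JSON payload. A sketch, with the endpoint name taken from the question (readAllChunks and downloadFolderImages are hypothetical helper names):

```javascript
// Drain a ReadableStream into an array of Uint8Array chunks.
const readAllChunks = async (stream) => {
  const reader = stream.getReader();
  const chunks = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
};

// Sketch of the client call. Instead of response.json() on one huge
// payload, read the body incrementally; credentials: 'include' makes
// the browser attach the auth cookie discussed above.
const downloadFolderImages = async (rootPath) => {
  const response = await fetch('api/getFolderImages', {
    method: 'POST',
    credentials: 'include',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ rootPath }),
  });
  const chunks = await readAllChunks(response.body);
  return new Blob(chunks); // or hand each chunk off as it arrives
};
```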

ADITYA AHLAWAT