
I have this function from the async module, which reads from an array of input files like:

const fs = require('fs');
const { map } = require('async');

const inputs = ['file1.txt', 'file2.txt'];

map(inputs, fs.readFile, (err, contents) => {
  if (err) console.log('Error: ' + err);
  else {
    const data = contents.reduce((a, b) => a + b);
    fs.writeFile(output, data, () => console.log(`Output in file '${output}'`));
  }
});

How can I set a timeout on the fs.readFile call? I want it to execute after 3 seconds, for example. I tried this, but it's not working; I guess it's a syntax problem and I'm not writing it the way I should:

map(inputs, setTimeout(fs.readFile, 3000), (err, contents) => {
  if (err) console.log('Error: ' + err);
  else {
    const data = contents.reduce((a, b) => a + b);
    fs.writeFile(output, data, () => console.log(`Output in file '${output}'`));
  }
});

This should be easy, but I'm stuck. Maybe I can't put the timeout inside the map function? Should I create a new function and, instead of calling fs.readFile, call my function?
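
Something like this, maybe? I haven't tested it, and delayedReadFile is just my guess at a name:

// Untested idea: a wrapper with the same (file, callback) signature that
// async's map expects, which waits 3 seconds before delegating to fs.readFile
function delayedReadFile(file, callback) {
  setTimeout(() => fs.readFile(file, callback), 3000);
}

map(inputs, delayedReadFile, (err, contents) => { /* same callback as above */ });

Thank you in advance.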

FrankDrebbin
  • how about calling the map function after three seconds? i.e. wrap all of your code in a setTimeout ... but why do you want to do such a thing ... seems like an odd thing to do – Jaromanda X Feb 21 '20 at 11:30
  • I want to concatenate the content of those files in the same order I called them via the command line (-c file1.txt file2.txt). If file 1 is very big and file 2 is small, it will put the content of file2 first, and I don't want that, so I want to give the function some time so I'm sure it finishes. – FrankDrebbin Feb 21 '20 at 11:32
  • ahh, so "3 seconds" is a guess - better off learning how to use asynchrony without a timeout that just guesses how long it will take ... see how fs.writeFile takes a callback ... `fs.writeFile(file, data[, options], callback)` – Jaromanda X Feb 21 '20 at 11:33
  • agree with @JaromandaX, you can use async/await – Juhil Somaiya Feb 21 '20 at 11:35
  • Yeah, wait until the first task finishes before you move on to the next; don't guess how long something will take before proceeding. – goto Feb 21 '20 at 11:35
  • @JuhilSomaiya - no you can't, because fs.writeFile doesn't return a Promise - you'd use `filehandle.writeFile(data, options)` instead – Jaromanda X Feb 21 '20 at 11:35
  • yeah, I'm learning that; the thing is, first I want to do it like this, then I will find another way to do it with promises or something like that. I'm not going to really have a file that big, but I want to know how to do it with the setTimeout, which I know should be pretty easy, but I don't know the exact syntax to place it inside the map. – FrankDrebbin Feb 21 '20 at 11:36
  • You can use `util.promisify` to turn the `fs.writeFile` into a `Promise`. – goto Feb 21 '20 at 11:36
  • @goto1 or `filehandle.writeFile(data, options)` - https://nodejs.org/api/fs.html#fs_class_filehandle – Jaromanda X Feb 21 '20 at 11:36
  • @JaromandaX, if I'm understanding correctly, we can use https://nodejs.org/api/fs.html#fs_fs_promises_api - check this supporting answer: https://stackoverflow.com/a/58332163/9361289 – Juhil Somaiya Feb 21 '20 at 11:47
  • @JuhilSomaiya - yes, which is not `fs.writeFile` - it's the `FileHandle` stuff I suggested to you, but thanks for uncorrecting me – Jaromanda X Feb 21 '20 at 11:49
  • @JaromandaX it wasn't intentional; I'd used it in the past but wasn't sure whether it was about the file system or file I/O. In a hurry, I searched and posted. – Juhil Somaiya Feb 21 '20 at 11:52
  • All good @JuhilSomaiya - I just thought it was weird you suggested what I suggested as a response :p – Jaromanda X Feb 21 '20 at 11:54
  • No issues, it was a good interaction and cleared up the concept for me. @JaromandaX – Juhil Somaiya Feb 21 '20 at 11:55

1 Answer


Even if you were to find the correct timing for those files, it would break whenever the file content changes, and adding a timeout like this is a total antipattern in Node.js.

If the files are not humongous, you can read them all and concatenate them afterwards:

const { promisify } = require('util');
const fs = require('fs');
// Create Promise versions of the fs calls we will need
const readFile = promisify(fs.readFile);
const writeFile = promisify(fs.writeFile);

/**
 * @param {string[]} files - Paths to the files to concatenate
 * @param {string} destination - Path of the concatenated file
 */
async function concat(files, destination) {
    // Read all files; Promise.all runs the reads in parallel and resolves
    // with the contents in the same order as `files`
    // (note: files.map(readFile) would pass the array index as readFile's
    // second argument, so wrap it in an arrow function)
    const contents = await Promise.all(files.map(file => readFile(file)));
    // Buffer.concat keeps the bytes as-is; replace it with whatever
    // concatenation you need
    return writeFile(destination, Buffer.concat(contents));
}
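
For example, with the inputs from the question (the output path here is just an example):

// Hypothetical usage: concatenate the two input files into output.txt
concat(['file1.txt', 'file2.txt'], 'output.txt')
    .then(() => console.log("Output in file 'output.txt'"))
    .catch(err => console.log('Error: ' + err));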

If you cannot keep more than one file's content in memory at a time, you can append them one after the other:

const fs = require('fs');
const { promisify } = require('util');
const readFile = promisify(fs.readFile);
const appendFile = promisify(fs.appendFile);

/**
 * @param {string[]} files - Paths to the files to concatenate
 * @param {string} destination - Path of the concatenated file
 */
async function concat(files, destination) {
    // For each file, in order
    for (const file of files) {
        // Read it
        const content = await readFile(file);
        // Append it at the end of the concatenated file (appendFile also
        // appends to a pre-existing destination, so remove or truncate the
        // destination first if you need a fresh file)
        await appendFile(destination, content);
    }
}
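
As mentioned in the comments, on Node 10 and later you could also skip promisify and use the fs.promises API directly; a rough, untested sketch of the same loop:

// Rough sketch using fs.promises instead of promisify
const fsPromises = require('fs').promises;

async function concat(files, destination) {
    for (const file of files) {
        const content = await fsPromises.readFile(file);
        await fsPromises.appendFile(destination, content);
    }
}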

I didn't test the code, so there are probably syntax errors here and there, but you should get the idea.

DrakaSAN