
I'm using this asynchronous recursive function to iterate through a directory's files and folders, and when a .css file is found, I append data to a file called 'common.css'.

var fs = require('fs');

var walk = function(dir, done) {
  var results = [];

  fs.readdir(dir, function(err, list) {
    if (err) return done(err);
    var pending = list.length;
    if (!pending) return done(null);
    list.forEach(function(file) {
      file = dir + '/' + file;
      fs.stat(file, function(err, stat) {
        if (stat && stat.isDirectory()) {
          // recurse into the subdirectory
          walk(file, function(err, res) {
            results = results.concat(res);
            if (!--pending) done(null);
          });
        } else if (file) {
          // append to common.css (note: this runs for every file,
          // and concurrent appends may interleave)
          fs.open('common.css', 'a', 0o666, function(err, id) {
            fs.write(id, 'hello', null, 'utf8', function() {
              fs.close(id, function() {
                console.log('file closed');
              });
            });
          });
          if (!--pending) done(null);
        }
      });
    });
  });
};

The problem is that, being asynchronous, I suspect there are times when several instances of the function are writing to the file at the same time. Is there any way I can control this flow and queue the writing tasks?

The purpose of this code is to merge the .css files that are found in the directory. I cannot use external tools in this project.

EDIT ANSWER: Well, for this purpose, I can just gather the paths of all the .css files I want to merge, and once I have them all, call a synchronous function to write them.

That's what the

var results = []

is there for. I just realized it.
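A minimal sketch of that idea (assuming walk is changed to pass the collected paths to its callback, as in the accepted approach further down):

var fs = require('fs');

// Once every path has been gathered, filter the .css files and merge
// them with synchronous writes, so no two writes can overlap.
walk('.', function(err, results) {
  if (err) throw err;
  results.filter(function(file) {
    return /\.css$/.test(file);
  }).forEach(function(file) {
    fs.appendFileSync('common.css', fs.readFileSync(file, 'utf8'));
  });
});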

F. Santiago
  • In the example you're opening and writing "hello" to common.css for every file in the directory, so if you have 3 files you'll write "hellohellohello". Is this your real code? For that purpose you could just count the files and then open the file once and write "hello" N times. – Gabriel Llamas Apr 30 '12 at 21:19

4 Answers


I think I would look into using something like the async.js library, which has utilities for making sure things happen in the right order, including async.forEachSeries(arr, eachFn, doneFn).
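For instance, a rough sketch (the cssFiles list and the merging logic are hypothetical, just to show the series pattern):

var fs = require('fs');
var async = require('async');

// Hypothetical list of .css paths gathered by the directory walk
var cssFiles = ['a.css', 'sub/b.css'];

// Each file is read and appended strictly one after another, so the
// writes to common.css can never interleave.
async.forEachSeries(cssFiles, function(file, callback) {
  fs.readFile(file, 'utf8', function(err, data) {
    if (err) return callback(err);
    fs.appendFile('common.css', data, callback);
  });
}, function(err) {
  if (err) return console.error(err);
  console.log('all files merged');
});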

Julian Knight

My suggestion would be: if you are doing this as a standalone application (not part of some much larger app), you could write all of your css files to console.log and invoke it with "node combine.js > combined.css". From what I've seen of FileStreams and console.log, it doesn't interleave the writes.
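As a rough sketch (file names hypothetical):

var fs = require('fs');

// Print every stylesheet to stdout; the shell redirection collects
// the output into a single file:
//   node combine.js > combined.css
['a.css', 'b.css'].forEach(function(file) {
  console.log(fs.readFileSync(file, 'utf8'));
});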

ControlAltDel
  • Well, for this purpose, I can just gather the paths of all the .css files I want to merge, and after I have them all, call a synchronous function to write them -.- thanks for your answer anyway user1291492! If this doesn't work, I'll look into your suggestion :) It is a pretty large web application. – F. Santiago Apr 30 '12 at 15:21
  • Do you have any suggestion for knowing exactly when all the function calls have ended? Knowing exactly when the last file has been found. – F. Santiago Apr 30 '12 at 15:35
  • You can't know if you've found the last file until you've finished searching all directories. How do you know if you've searched all directories? It's tough to know this in node's callback framework – ControlAltDel Apr 30 '12 at 15:52

You could open common.css at the beginning and keep the file descriptor open, then call fs.write(id, ...) whenever you encounter a CSS file. This is instead of your current method of re-opening common.css each time.

In your code, it looks like you are writing hello to the file. If that is literally true, or if what you’re writing is relatively short (typically 512 or 4096 bytes depending on platform), all the appends to common.css will be atomic so different asynchronous functions writing at the same time won’t interfere with each other.
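A minimal sketch of the idea, with a single write standing in for the per-file writes:

var fs = require('fs');

// Open common.css once in append mode and reuse the descriptor for
// every write, instead of re-opening the file each time.
fs.open('common.css', 'a', 0o666, function(err, fd) {
  if (err) throw err;
  fs.write(fd, 'hello', null, 'utf8', function(err) {
    if (err) throw err;
    // close only once the last write has finished
    fs.close(fd, function() {
      console.log('common.css closed');
    });
  });
});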

Guan Yang
  • I tried your suggestion, but common.css isn't being written correctly. I managed to create a synchronous loop to do this, so the problem isn't multiple instances writing at the same time any more. It is probably the appending function that must not be working well. I see that the fs.write() function of node.js takes a buffer and not a string. I'll try that. – F. Santiago May 02 '12 at 08:45

Well, I ended up using a serial recursive function for the same purpose. It solved the problem. The code is taken from the answer by user "chjj" in node.js fs.readdir recursive directory search:

var fs = require('fs');
var walk = function(dir, done) {
  var results = [];
  fs.readdir(dir, function(err, list) {
    if (err) return done(err);
    var i = 0;
    // process the directory entries strictly one at a time
    (function next() {
      var file = list[i++];
      // no entries left: report the collected paths
      if (!file) return done(null, results);
      file = dir + '/' + file;
      fs.stat(file, function(err, stat) {
        if (stat && stat.isDirectory()) {
          // recurse into the subdirectory, then move to the next entry
          walk(file, function(err, res) {
            results = results.concat(res);
            next();
          });
        } else {
          // plain file: record its path and move on
          results.push(file);
          next();
        }
      });
    })();
  });
};
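For example (directory name hypothetical):

walk('./styles', function(err, results) {
  if (err) throw err;
  // results now holds every file path under ./styles, in order
  console.log(results);
});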

Thank you all for your suggestions :)

F. Santiago