0

I'm new to node.js and I'm wondering if it is possible to iterate in a loop synchronously. Let's say that in a for loop I call a blocking function (reading, writing) and I want the for loop to execute synchronously (wait for the first iteration to finish, then do the second one, ...). I saw some tutorials about the async module, but none of them covered this problem.

SebiSebi
  • `for` loops are synchronous, and if you call only blocking functions inside the for block, the whole loop will be synchronous as well. I really didn't get your issue... – Felipe Sabino Jun 26 '15 at 15:59
  • Why would the loop not be synchronous by default? – Mark Meyer Jun 26 '15 at 15:59
  • This needs more information. – James G. Jun 26 '15 at 16:02
  • If at each iteration I read a big set of data, then node.js will do this: it runs the first reading attempt, then it does not wait for it to finish and runs the second reading (iteration 2). I want it to run the first reading, wait for it to finish, then run the second one, and so on. – SebiSebi Jun 26 '15 at 16:07
  • @SebiSebi if your `for` loop is iterating an array, you should check out [`async#eachSeries`](https://github.com/caolan/async#eachseriesarr-iterator-callback) (a sketch follows these comments). – robertklep Jun 26 '15 at 17:18
  • "runs the first reading attempt, then it does not wait for this to finish" this means that whatever you're calling inside the loop **isn't** synchronous. "I want that after running the first reading it waits for it to finish and then runs the second one, and so on." So that means you don't actually want to iterate in a synchronous loop, what you want is **asynchronous** loop iteration. – idbehold Jun 26 '15 at 17:20

2 Answers

5

You shouldn't use a for loop for this. The for loop would fire off each of the async functions simultaneously, which is obviously not what you want. You should create your own system for calling these. Try something like this:

  1. You'll want an array to hold all of the URLs that you want to hit
  2. You'll want to create a function that calls the first URL in the list and waits for the response
  3. In the callback for this call, once all of the asynchronous work is done, call the same function again to restart the process, until there are no URLs left to hit

Should look a little like this:

var urls = ["google.com", "youtube.com", "stackoverflow.com"];

function callFirstInArray() {
   var url = urls.shift(); // removes the first element from the array and stores it in 'url'

   // do the ajax work, and set the callback function:
   $.ajax({
      url: url,
      success: function () {       // runs once this request has finished
         if (urls.length > 0) {    // more URLs left, so process the next one
            callFirstInArray();
         }
      }
   });
}

Your code may not look exactly like what I described here, but it should follow the same concept.

Note: You will probably need an extra function scope to protect your original array of URLs, because .shift() will mutate the original array.
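
One way to do that (just a sketch of the idea, assuming jQuery's $.ajax as in the snippet above; the processAll wrapper is a hypothetical name) is to work on a copy of the array:

function processAll(originalUrls) {
   var urls = originalUrls.slice(); // copy, so shift() doesn't mutate the caller's array

   function callFirstInArray() {
      var url = urls.shift();
      $.ajax({
         url: url,
         success: function () {
            if (urls.length > 0) {
               callFirstInArray();
            }
         }
      });
   }

   callFirstInArray();
}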

Justin Hathaway
  • Ok. Is it possible to handle this situation using async? – SebiSebi Jun 26 '15 at 16:32
  • What do you mean by that? This solution handles the complications that come with using asynchrony – Justin Hathaway Jun 26 '15 at 16:37
  • I apologize, I'm not familiar with the async module you're talking about. – Justin Hathaway Jun 26 '15 at 16:46
  • I have found other posts where people are saying this solution leads to stack overflows. It will work for a small array, but for literally millions of elements in the array, it will break the runtime. Example: https://stackoverflow.com/questions/15902211/nodejs-making-the-while-loop-synchronous – danivicario Apr 25 '17 at 21:15
0

You mention that you are reading and writing files; are you using fs.readFile and fs.writeFile? These are asynchronous methods, so you would need to invoke one after the other in callbacks.
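
Chaining them might look roughly like this (a minimal sketch; the file names and the toUpperCase transform are made up for illustration):

var fs = require('fs');

// read in.txt, and only start the write once the read has finished
fs.readFile('in.txt', 'utf8', function (err, data) {
   if (err) throw err;
   fs.writeFile('out.txt', data.toUpperCase(), function (err) {
      if (err) throw err;
      console.log('The write only started after the read completed.');
   });
});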

If you do want to use a loop, take a look at the synchronous versions of those methods: fs.readFileSync and fs.writeFileSync.
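
With the synchronous versions, a plain for loop really does wait for each iteration before starting the next. A minimal sketch (the file names are placeholders):

var fs = require('fs');

var files = ['one.txt', 'two.txt', 'three.txt']; // hypothetical input files

for (var i = 0; i < files.length; i++) {
   // readFileSync blocks until the file is fully read, so iterations run strictly in order
   var data = fs.readFileSync(files[i], 'utf8');
   fs.writeFileSync(files[i] + '.copy', data);
}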

thgaskell