It is my understanding that -- from a performance perspective -- direct assignment is more desirable than .push() when populating an array.

My code is currently as follows:

for each (var e in Collection) {
  do {
    // renamed the callback parameter so it no longer shadows the outer "e"
    DB_Query().forEach(function(item) { data.push([item.title, item.id]); });
  } while (pageToken);
}

The DB_Query() method runs a Google Drive query and returns a list.

My issue arises because DB_Query() can return a list of variable length. As such, if I construct data = new Array(100), direct assignment has the potential to go out of bounds.

Is there a method by which I could try and catch an Out of Bounds exception to have values directly assigned for the 100 pre-allocated indices, but use .push() for any overflow? The expectation here is that an OOB exception will not occur often.

Also, I'm not sure if it matters, but I am clearing the array after a counter variable is >=100 using the following method:

while(data.length > 0) {data.pop()}
toolshed

1 Answer


In JavaScript, if you set a value at an index beyond the array's length, the array will automatically "stretch" to accommodate it, so there's no need to bother with this. If you can make a good guess about your array size, go for it.
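A minimal sketch of that behavior — assigning past the end of a pre-allocated array simply grows it, and no out-of-bounds exception is thrown:

```javascript
// Pre-allocate an array of length 3, then assign beyond it.
var data = new Array(3);
data[0] = "a";
data[5] = "f";            // index beyond the initial length
console.log(data.length); // 6 -- the array stretched automatically
```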

About your clearing loop: that's correct, and it seems that pop is indeed the fastest way. My original suggestion was to set the array length back to zero: data.length = 0;
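Both approaches empty the array in place; a small sketch of the two for comparison:

```javascript
// Option 1: truncate by resetting length.
var data = [1, 2, 3];
data.length = 0;
console.log(data.length); // 0

// Option 2: pop in a loop (reportedly the fastest in benchmarks).
var other = [1, 2, 3];
while (other.length > 0) { other.pop(); }
console.log(other.length); // 0
```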

Now a tip that I think really makes a performance difference here: you're worrying about the wrong part!

In Apps Script, what takes long is not resizing arrays dynamically or transforming your data -- that's fast. The cost is always in the API calls: UrlFetch, Spreadsheet.Range.getValue, and so on.

You should make the minimum number of API calls possible, and in your case (I'm guessing now, since I haven't seen your whole code) you seem to be doing it wrong. If DB_Query is costly (in API-call terms), you should not nest it under two loops. The best solution usually involves figuring out everything you'll need beforehand (do as many loops as you need, as long as they don't make API calls), then passing all parameters to a bulk operation that gathers everything at once (in one API call), even if that means fetching more data than you need. Then, with the whole data set at hand, loop through it and transform it as required (that's the fast part).

Henrique G. Abreu
  • Popping values off the array is apparently more efficient according to http://stackoverflow.com/questions/1232040/how-to-empty-an-array-in-javascript. Seems counter-intuitive, but the benchmarks substantiate it. From an implementation standpoint, I'm interested in why this is. – toolshed Jul 21 '14 at 15:26
  • Regarding API calls, I originally tried to minimize them by retrieving a search result yielding metadata for all folders in Google Drive and iterating over them. I noted, however, that many folders were excluded from the result, and haven't been able to isolate the reason for this. But that is probably an issue for another SO post. Perhaps the request is timing out because it returns too many results? Either way, I've confirmed that all PageTokens are iterated through, and yet some of the expected values are missing from the data set. – toolshed Jul 21 '14 at 15:38
  • I edited my answer to note that popping the elements is indeed the fastest method. And yeah, it seems counter-intuitive. – Henrique G. Abreu Jul 21 '14 at 15:59