
When using the `new Array(size)` constructor with a size that is not a constant, JavaScript seems to create a sparse array in some browsers (at least in Chrome), making element access much slower than with the default constructor, as shown here.

That is exactly the opposite of what I want: I pre-allocate an array of a given size to avoid dynamic re-allocation and thereby improve performance. Is there any way to achieve that goal?
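To illustrate, here is a rough sketch of the kind of comparison I mean (the size constant and timing approach are just placeholders; actual numbers depend on the engine and version):

```js
// Rough benchmark sketch: pre-sized constructor vs. default constructor.
// SIZE is an assumed value; real timings are engine- and version-specific.
var SIZE = 1000000;

function fillPreallocated() {
    var a = new Array(SIZE); // pre-sized; may become a slow sparse array
    for (var i = 0; i < SIZE; i++) a[i] = i;
    return a;
}

function fillDefault() {
    var a = []; // default constructor; backing store grows as needed
    for (var i = 0; i < SIZE; i++) a[i] = i;
    return a;
}

console.time("preallocated");
fillPreallocated();
console.timeEnd("preallocated");

console.time("default");
fillDefault();
console.timeEnd("default");
```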

Please note that this question is not about the ambiguity of the `new Array(size)` constructor. I posted a recommendation on that here.

Domi

2 Answers


100,000 is one past the pre-allocation threshold; 99,999 still pre-allocates and, as you can see, it's much faster:

http://jsperf.com/big-array-initialize/5
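A minimal runnable sketch of that comparison (the exact threshold is a V8 implementation detail and may differ across versions, so treat the sizes below as assumptions; differences may only show up over many repetitions):

```js
// Fill a pre-sized array `reps` times and report how long it took.
function timeFill(size, reps) {
    var start = Date.now();
    for (var r = 0; r < reps; r++) {
        var a = new Array(size);
        for (var i = 0; i < size; i++) a[i] = i;
    }
    return Date.now() - start;
}

// 99,999 stays on the pre-allocated fast path; 100,000 does not (in V8 at the time).
console.log("99999:  " + timeFill(99999, 100) + " ms");
console.log("100000: " + timeFill(100000, 100) + " ms");
```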

Esailija
  • Thanks for the info. Is there no way to fix that? Shouldn't there be a simple flag to override that option in every JS instance? Else, if the data sent by the server happens to surpass this incredibly arbitrary number, I get in trouble, big trouble. – Domi Jan 04 '14 at 13:41
  • @Domi That would be the fault of the application code that doesn't validate user input. The threshold has to do with implementation details of V8 garbage collector and heap, so there is no way to make it bigger. – Esailija Jan 04 '14 at 14:56
  • Would you mind posting an official link for that? And do you happen to know how one would go about getting this fixed in future JS versions? – Domi Jan 04 '14 at 18:01
  • 1
    @Domi this is not about Javascript but V8. Javascript arrays in the language do not imply any kind of allocation or even that they are backed by arrays even - in fact if you read the [specification for Javascript arrays](http://www.ecma-international.org/ecma-262/5.1/#sec-15.4) the obvious implementation would be a hash table. Closest to official link is the source code you can read https://github.com/v8/v8 . – Esailija Jan 04 '14 at 18:33
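To illustrate that last point, a short sketch of the map-like behavior the spec permits (this is standard JavaScript semantics; how an engine stores it internally is up to the implementation):

```js
var a = [];
a[0] = "first";
a[1000000000] = "far away";  // no billion-element buffer needs to exist
console.log(a.length);       // 1000000001: length is just highest index + 1
console.log(Object.keys(a)); // ["0", "1000000000"]: only two actual entries
```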

Pre-allocated vs. dynamically grown is only part of the story. For preallocated arrays, 100,000 just so happens to be the threshold where V8 decides to give you a slow (a.k.a. "dictionary mode") array.

Also, growing arrays on demand doesn't allocate a new array every time an element is added. Instead, the backing store is grown in chunks (currently by roughly 50% each time it runs out of space, though that heuristic may change over time).
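A rough sketch of why chunked growth keeps appending cheap (the starting capacity and exact growth formula below are illustrative assumptions, not V8's actual code):

```js
// Simulate a backing store that grows by ~50% whenever it fills up.
var capacity = 4;
var reallocations = 0;
for (var length = 0; length < 1000000; length++) {
    if (length === capacity) {
        capacity = Math.floor(capacity * 1.5) + 1; // grow by roughly 50%
        reallocations++;
    }
}
// Roughly 30 re-allocations for a million appends, not a million:
console.log(reallocations + " re-allocations for 1e6 appends");
```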

You can find more information here.

Esailija
Avinash Babu
  • I did not make any contradictory claim. The link that I provided explains how dynamic allocation keeps the operation count at O(1). – Domi Jan 04 '14 at 13:39
  • 1
    As Esailija did for you here, you **must** place any copied wording in blockquotes to indicate that you did not write this. Any answers that I see that do not follow this rule will be deleted. – Brad Larson Jan 04 '14 at 19:42