125

Arrays in JavaScript are very easy to modify by adding and removing items. This somewhat masks the fact that in most languages, arrays are fixed-size and require complex operations to resize. JavaScript seems to make it easy to write poorly performing array code. This leads to the question:

What performance (in terms of big O time complexity) can I expect from JavaScript implementations in regards to array performance?

I assume that all reasonable JavaScript implementations have at most the following big-O complexities.

  • Access - O(1)
  • Appending - O(n)
  • Prepending - O(n)
  • Insertion - O(n)
  • Deletion - O(n)
  • Swapping - O(1)

JavaScript lets you pre-fill an array to a certain size using the new Array(length) syntax. (Bonus question: is creating an array in this manner O(1) or O(n)?) This is more like a conventional array, and if used as a pre-sized array, can allow O(1) appending. If circular buffer logic is added, you can achieve O(1) prepending. If a dynamically expanding array is used, amortized O(1) will be the average case for both of those.
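As a rough illustration of the circular-buffer idea over a pre-sized array (a hand-rolled sketch; the class and method names are made up for this example, not a built-in API):

```javascript
// Minimal ring buffer over a fixed-size backing array.
// Both append (pushBack) and prepend (pushFront) are O(1):
// they only move an index, never shift existing elements.
class RingBuffer {
  constructor(capacity) {
    this.items = new Array(capacity); // pre-sized backing store
    this.head = 0;                    // index of the first element
    this.size = 0;
  }
  pushBack(value) {
    if (this.size === this.items.length) throw new Error('buffer full');
    this.items[(this.head + this.size) % this.items.length] = value;
    this.size++;
  }
  pushFront(value) {
    if (this.size === this.items.length) throw new Error('buffer full');
    this.head = (this.head - 1 + this.items.length) % this.items.length;
    this.items[this.head] = value;
    this.size++;
  }
  get(i) { // O(1) access, like a plain array
    return this.items[(this.head + i) % this.items.length];
  }
}

const buf = new RingBuffer(4);
buf.pushBack(2);
buf.pushBack(3);
buf.pushFront(1); // no element shifting needed
// buf now holds 1, 2, 3
```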

Can I expect better performance for some things than my assumptions here? I don't expect any of this is outlined in the specification, but in practice, it could be that all major implementations use optimized arrays behind the scenes. Are there dynamically expanding arrays or other performance-boosting algorithms at work?
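One way to probe this empirically is a crude timing sketch like the following (illustrative only; real benchmarking needs warm-up runs and many iterations):

```javascript
// Rough timing comparison: appending via push vs. prepending via unshift.
// If push is amortized O(1) and unshift is O(n) (it renumbers every
// existing index), the unshift loop should scale far worse as n grows.
function time(label, fn) {
  const start = Date.now();
  fn();
  console.log(label + ':', Date.now() - start, 'ms');
}

const n = 20000;

time('push', () => {
  const a = [];
  for (let i = 0; i < n; i++) a.push(i);    // append at the end
});

time('unshift', () => {
  const a = [];
  for (let i = 0; i < n; i++) a.unshift(i); // shifts everything right first
});
```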

P.S.

The reason I'm wondering this is that I'm researching some sorting algorithms, most of which seem to assume appending and deleting are O(1) operations when describing their overall big O.

Anurag Baundwal
Kendall Frey
  • The Array constructor with a size is pretty much useless in modern JavaScript implementations. It does almost nothing at all in that single-parameter form. (It sets `.length`, but that's about it.) Arrays are really not much different from plain Object instances. – Pointy Jul 17 '12 at 00:02
  • Setting the `length` property and pre-allocating space are two completely different things. – Pointy Jul 17 '12 at 00:04
  • @Pointy: Am I expecting too much when I expect setting `array[5]` on a `new Array(10)` is O(1)? – Kendall Frey Jul 17 '12 at 00:06
  • It's probably not quite O(1), but it's more like adding an element to a hashtable than a linear array re-allocation. – Pointy Jul 17 '12 at 00:06
  • Ouch. Really? Is there no way to pre-size an array at all? – Kendall Frey Jul 17 '12 at 00:09
  • While the ECMAScript specification does *not* define how an Array object is implemented (it only defines some semantic rules), it is very possible that different implementations will optimize for expected cases (e.g. have a "real array" backing for arrays less than some n in size). I am not that savvy on implementations, but would be *really surprised* if this was not done somewhere... – Jul 17 '12 at 00:19
  • @pst: That's the viewpoint I had. – Kendall Frey Jul 17 '12 at 00:20
  • @KendallFrey The best answer is likely to write some jsperf test cases for different n / access patterns and see what comes of it ;-) – Jul 17 '12 at 00:21
  • It would be surprising to see an array which had O(n) for both appending and prepending. – argentage Jul 17 '12 at 14:24
  • @Pointy Setting the length property *will* preallocate space. `let a = []; a.length = 12` will create an array of `undefined` x 12. Did I misunderstand your comment? – Ced Jun 15 '17 at 01:15
  • @Ced It's not necessarily the case that setting the `length` property will actually allocate storage space. For example, try a test program that initializes 1000 arrays and sets the `length` property of each one to `2000000000`. – Pointy Jun 18 '17 at 19:49
  • @Ced An implementation which wanted to optimise for cases like that could use a sparse array (array of arrays) so only those sections of the array which are used ever get allocated. – Michael Grazebrook Sep 29 '20 at 15:44

2 Answers

132

NOTE: While this answer was correct in 2012, engines use very different internal representations for both objects and arrays today. This answer may or may not be true.

In contrast to most languages, which implement arrays with, well, arrays, JavaScript Arrays are objects, and their values are stored in a hashtable, just like regular object properties. As such:

  • Access - O(1)
  • Appending - Amortized O(1) (sometimes resizing the hashtable is required; usually only insertion is required)
  • Prepending - O(n) via unshift, since it requires reassigning all the indexes
  • Insertion - Amortized O(1) if the value does not exist. O(n) if you want to shift existing values (e.g. using splice).
  • Deletion - Amortized O(1) to remove a value, O(n) if you want to reassign indices via splice.
  • Swapping - O(1)

In general, setting or unsetting any key in a dict is amortized O(1), and the same goes for arrays, regardless of what the index is. Any operation that requires renumbering existing values is O(n) simply because you have to update all the affected values.
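The operations above map onto the standard Array methods roughly like this (complexities per this answer's hashtable model):

```javascript
const a = [1, 2, 3];

a[1];              // access: O(1)
a.push(4);         // append: amortized O(1)        -> [1, 2, 3, 4]
a.unshift(0);      // prepend: O(n), renumbers all  -> [0, 1, 2, 3, 4]
a.splice(2, 0, 9); // insert with shifting: O(n)    -> [0, 1, 9, 2, 3, 4]
a.splice(2, 1);    // delete with renumbering: O(n) -> [0, 1, 2, 3, 4]
a.pop();           // remove the last value: O(1)   -> [0, 1, 2, 3]

// swap two elements: O(1), just two reads and two writes
const tmp = a[0];
a[0] = a[3];
a[3] = tmp;        // -> [3, 1, 2, 0]
```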

Jonas Wilms
Nick Johnson
  • Shouldn't prepend be O(n)? Since all the indices need to be shifted. Same for insertion and deletion (at an arbitrary index, with shifting/collapsing of the elements). – nhahtdh Jul 18 '12 at 06:05
  • Also, is `length` set on the Array mutation, or is the `get` on it going to get the length and possibly memoize it? – alex Jul 18 '12 at 06:41
  • @nhahtdh Javascript doesn't have 'prepend', 'insert', or 'delete' operations that reassign array indices. It does have a 'splice' operation which does that, and is (obviously) O(n). – Nick Johnson Jul 18 '12 at 07:26
  • For `length`, see [here](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/#Relationship_between_length_and_numerical_properties). – Nick Johnson Jul 18 '12 at 07:28
  • @NickJohnson: There is a prepend: `unshift`. I can't see how your answer has anything to do with the question. – nhahtdh Jul 18 '12 at 08:22
  • @nhahtdh Fair enough, I missed unshift. I don't know why you're having difficulty relating my answer to the question; it seems like a pretty straightforward thing to me. – Nick Johnson Jul 18 '12 at 12:54
  • @NickJohnson: I mentioned this in my first comment: I don't get how insertion and deletion are O(1) amortized. The only kind of "insert" and "delete" that can have O(1) amortized complexity is assigning a value to the array. – nhahtdh Jul 18 '12 at 12:58
  • Worth mentioning this answer is no longer correct. Modern engines do not store Arrays (or objects with indexed integer keys) as hashtables (but as, well... arrays like in C) unless they're sparse. To get you started, [here is a 'classical' benchmark illustrating this](http://jsperf.com/packed-vs-holey-arrays). – Benjamin Gruenbaum Oct 29 '13 at 19:13
  • @BenjaminGruenbaum As far as I know, this doesn't affect the complexity or behaviour, only the constant factor (which is beyond the scope of the question). – Kendall Frey Oct 29 '13 at 19:23
  • You're right, this answer corrects incorrect information that is beyond the scope of the question. I just wanted to point that out; didn't downvote or anything. – Benjamin Gruenbaum Oct 29 '13 at 19:24
  • Can we get a reference for this? I'd love to learn more about how this was derived. – badunk Nov 05 '13 at 01:50
  • @badunk The values are all just those for a hashtable, which you can look up big-O analysis for on Wikipedia or elsewhere. – Nick Johnson Nov 05 '13 at 09:58
  • Is this defined by the standard or is this just a common implementation in JS engines? What about V8? – Albert Mar 24 '14 at 16:10
  • @BenjaminGruenbaum It would be nice if you could elaborate a bit on how they are stored. Or give some sources. – Ced Jun 15 '17 at 01:23
  • I'm still confused. Is `new Array(100000)` slower than `new Array(10)`? – Squirrl Apr 09 '18 at 06:25
  • Is it fair to say this means it acts like an efficient stack/LIFO (but not as a queue/FIFO)? – Andy Hayden Feb 19 '19 at 19:51
  • @BenjaminGruenbaum So what's the answer now? – Andy Hayden Feb 20 '19 at 03:27
  • What does amortized mean? Does this mean deletion via `.pop` is O(1)? – Noitidart Mar 23 '19 at 17:24
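Picking up the question about what "amortized" means: it can be illustrated with a hand-rolled doubling array (a sketch only; the class and method names are made up, and real engines are not obligated to work this way):

```javascript
// Growable array with capacity doubling. Most appends just write one
// slot (O(1)); occasionally a resize copies all n elements (O(n)).
// Over n appends the copies sum to less than 2n, so the average
// cost per append -- the amortized cost -- is O(1).
class GrowableArray {
  constructor() {
    this.data = new Array(1); // start with capacity 1
    this.length = 0;
  }
  append(value) {
    if (this.length === this.data.length) {
      // Resize: allocate double the capacity and copy (the O(n) case).
      const bigger = new Array(this.data.length * 2);
      for (let i = 0; i < this.length; i++) bigger[i] = this.data[i];
      this.data = bigger;
    }
    this.data[this.length++] = value; // the common O(1) case
  }
}

const g = new GrowableArray();
for (let i = 0; i < 10; i++) g.append(i);
// 10 appends triggered resizes at capacities 1, 2, 4, 8:
// total copies 1 + 2 + 4 + 8 = 15, i.e. O(1) per append on average
```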
5

guarantee

There is no specified time-complexity guarantee for any array operation. How arrays perform depends on the underlying data structure the engine chooses. Engines might also have different representations, and switch between them depending on certain heuristics. The initial array size might or might not be such a heuristic.

reality

For example, V8 uses (as of today) both hashtables and array lists to represent arrays. It also has various different representations for objects, so arrays and objects cannot be compared. Therefore array access is always better than O(n), and might even be as fast as a C++ array access. Appending is O(1), unless you reach the size of the datastructure and it has to be scaled (which is O(n)). Prepending is worse. Deletion can be even worse if you do something like `delete array[index]` (don't!), as that might force the engine to change its representation.
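A sketch of the point about `delete` (the observable semantics below are standard JavaScript; the representation change itself is engine-internal and assumed here):

```javascript
const a = [1, 2, 3, 4];

// delete leaves a hole: length is unchanged and the array becomes
// sparse, which may push the engine into a slower representation.
delete a[1];
// a is now [1, <hole>, 3, 4]; a.length is still 4 and a[1] is undefined

// splice actually removes the slot and keeps the array packed.
const b = [1, 2, 3, 4];
b.splice(1, 1); // b is [1, 3, 4], b.length === 3
```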

advice

Use arrays for numeric data structures. That's what they are meant for, and that's what engines will optimize them for. Avoid sparse arrays (or, if you have to use them, expect worse performance). Avoid arrays with mixed datatypes (as that makes internal representations more complex).
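In practice that advice boils down to patterns like these (rules of thumb, not guarantees):

```javascript
// Prefer: dense, single-type arrays built front-to-back.
const good = [];
for (let i = 0; i < 5; i++) good.push(i * 2); // stays packed, all numbers

// Avoid: creating holes; this makes the array sparse.
const sparse = [];
sparse[1000] = 1; // indices 0..999 are holes

// Avoid: mixing types in one hot array; it forces a more
// generic internal representation.
const mixed = [1, 'two', { three: 3 }];
```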

If you really want to optimize for a certain engine (and version), check its source code for the absolute answer.

Ry-
Jonas Wilms
  • Wait a second, we can have arrays with mixed datatypes? Javascript is so cool! – Anurag Baundwal Jun 04 '20 at 08:54
  • @Anurag Exactly, but in 99% of cases you wouldn't need this feature. – Desiigner Jul 04 '20 at 08:21
  • @Jonas-Wilms, could you please elaborate on the "unless you reach the size of the datastructure and it has to be scaled" part? We don't usually set array length on array creation in JS; does it mean that appending to any array created the regular way (like `let a = [1,2,3]`) requires datastructure scaling? – grreeenn Apr 07 '21 at 00:00
  • @grreeenn I guess not; the engine will probably directly allocate an array list with a larger size. – Jonas Wilms Apr 07 '21 at 12:22