48

In JavaScript, if I do something like

var alpha = [];
alpha[1000000] = 2;

does this waste memory somehow? I remember reading something about JavaScript arrays still setting values for unspecified indices (maybe setting them to undefined?), but I think that may have had something to do with delete. I can't really remember.

Nick
    It depends on the implementation. – Gabe Dec 24 '10 at 03:52
  • I can't say for sure, but I remember IE7 getting slower and slower when I was using an array to store component ids. I would delete the items, but I never re-used those indexes. The problem went away when I wrote a class that would re-use them. – Hemlock Dec 24 '10 at 03:53

5 Answers

30

See this topic: are-javascript-arrays-sparse

In most implementations of JavaScript (probably all modern ones), arrays are sparse. That means no, it's not going to allocate memory up to the maximum index.

If it's anything like a Lua implementation, there are actually an internal array and a dictionary. Densely populated parts starting from index 0 are stored in the array, and sparse portions in the dictionary.
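As a rough check (assuming an ES5-capable engine for Object.keys), you can observe the sparseness directly from script: only the assigned index exists as an own property.

```javascript
// Sketch, not a memory measurement: it shows that the engine never
// creates the intermediate elements, only the one we assigned.
var alpha = [];
alpha[1000000] = 2;

console.log(alpha.length);        // 1000001 -- length is index + 1
console.log(Object.keys(alpha));  // ["1000000"] -- a single own key
console.log(500 in alpha);        // false -- a hole, not an undefined slot
console.log(1000000 in alpha);    // true
```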

patros
  • "there is actually an internal array and dictionary..." -- that is roughly what I found when I did some testing. The internal array is used for densely populated arrays that start from index 0; otherwise a hash table is used. – Dan Breslau Dec 24 '10 at 04:07
  • +1 In the book, "JavaScript: The Definitive Guide (5th Edition)," part 7.6.1 (pg 115) -- explains that arrays may be sparse and JS implementations should allocate memory only for those elements that are actually stored in the array. Great reference book, btw! – Kai Dec 24 '10 at 04:24
26

This is an old myth. The other indexes on the array will not be assigned.

When you assign to a property name that is an "array index" (a name that represents an unsigned 32-bit integer, e.g. alpha[10] = 'foo') and that index is greater than or equal to the current value of the length property of an Array object, two things will happen:

  1. The "index named" property will be created on the object.
  2. The length property will be set to that index + 1.

Proof of concept:

var alpha = [];
alpha[10] = 2;
alpha.hasOwnProperty(0);  // false, the property doesn't exist
alpha.hasOwnProperty(9);  // false
alpha.hasOwnProperty(10); // true, the property exists
alpha.length;             // 11

As you can see, the hasOwnProperty method returns false when we test for the 0 or 9 properties, because they don't physically exist on the object, whereas it returns true for 10, the only property that was created.

This misconception probably comes from popular JS consoles, like Firebug: when they detect that the object being printed is array-like, they simply loop from 0 to length - 1 and show each index's value.

For example, Firebug detects array-like objects simply by checking whether they have a length property whose value is an unsigned 32-bit integer (less than 2^32 - 1) and a splice property that is a function:

console.log({length:3, splice:function(){}});
// Firebug will log: `[undefined, undefined, undefined]`

In the above case, Firebug internally makes a sequential loop to show each of the property values, but none of those indexes really exists. Showing [undefined, undefined, undefined] gives you the false impression that those properties exist, or that they were "allocated", but that's not the case...

This behavior has been the same forever; it's specified even in the ECMAScript 1st Edition specification (from 1997), so you shouldn't worry about implementation differences.
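To make the distinction concrete, here's a small sketch contrasting a hole with an element whose value is explicitly undefined; reading either gives undefined, but only one of them is a real property:

```javascript
var sparse = [];
sparse[2] = 'c';                         // indexes 0 and 1 are holes

var dense = [undefined, undefined, 'c']; // indexes 0 and 1 really exist

console.log(sparse[0]);                  // undefined (missing property)
console.log(dense[0]);                   // undefined (real property)
console.log(sparse.hasOwnProperty(0));   // false
console.log(dense.hasOwnProperty(0));    // true
```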

Christian C. Salvadó
    This is more of a description of the __logical__ behavior of an array, rather than a guide to the runtime performance of arrays. I haven't seen anything in the spec that constrains memory or CPU consumption to any particular model (not that my reading of the spec is definitive.) – Dan Breslau Dec 24 '10 at 05:29
    Just to add to that, even if you add a property name that can be converted to an unsigned integer, the length property on the array will be incremented (if needed). (e.g. `var a = []; a["121"] = 1` increments `a.length` to 122.) – Bharat Khatri Jan 16 '14 at 08:01
2

About a year ago, I did some testing on how browsers handle arrays (obligatory self-promotional link to my blog post.) My testing was aimed more at CPU performance than at memory consumption, which is much harder to measure. The bottom line, though, was that every browser I tested with seemed to treat sparse arrays as hash tables. That is, unless you initialized the array from the get-go by putting values in consecutive indexes (starting from 0), the array would be implemented in a way that seemed to optimize for space.

So while there's no guarantee, I don't think that setting array[100000] will take any more room than setting array[1] -- unless you also set all the indexes leading up to those.
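One script-level hint that sparse arrays aren't stored as contiguous blocks (assuming an ES5 engine for forEach): iteration methods skip the holes entirely, visiting only the elements that actually exist.

```javascript
var arr = [];
arr[1] = 'a';
arr[100000] = 'b';

var visited = [];
arr.forEach(function (value, index) {  // forEach skips holes
  visited.push(index);
});
console.log(visited);  // [1, 100000] -- only two iterations, not 100001
```

This is only circumstantial evidence about storage, of course; the spec mandates the skipping behavior, not any particular memory layout.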

Dan Breslau
  • So do you think this would be an ideal way to reference some objects by id? – Nick Dec 24 '10 at 04:09
  • @Nick: I don't know what your range of ids is, but I would not hesitate to use an array simply because it might have some large index values. In any case, are there really any alternatives that don't also result in implicitly or explicitly using a hash table? – Dan Breslau Dec 24 '10 at 04:13
0

I don't think so, because JavaScript treats arrays somewhat like dictionaries, but with integer keys.

alpha[1000000] === alpha["1000000"]  // true, the same property
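A small sketch (ES5's Object.keys assumed) showing that a numeric index is converted to a string property name before the lookup, so both forms address the same property:

```javascript
var alpha = [];
alpha[1000000] = 2;

console.log(alpha[1000000] === alpha["1000000"]);  // true, one property
console.log(Object.keys(alpha));                   // ["1000000"]
console.log(typeof Object.keys(alpha)[0]);         // "string" -- keys are strings
```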

keegan3d
-11

I don't really know JavaScript, but it would be pretty odd behaviour if it DIDN'T allocate space for the entire array. Why would you think it wouldn't take up space? You're asking for a huge array. If it didn't give you one, that would be a specific optimisation.

This obviously ignores OS optimisations such as memory overcommit and other kernel and implementation specifics.

Falmarri
  • It doesn't allocate space for the entire array. Think of it this way: JavaScript was originally intended for use by casual developers. No performance guarantees were written into the language. As a result, JavaScript implementers are basically forced to make time/space compromises, because the developers aren't. – Dan Breslau Dec 24 '10 at 04:16
  • I guess this was my point. You should EXPECT it to allocate space. If it does or doesn't, that's just optimization. – Falmarri Dec 24 '10 at 04:20
    No, you should not expect it to allocate space. Nothing in the JavaScript language spec says that JavaScript arrays will behave as contiguous chunks of memory. If you care about either CPU or space consumption, you need to use benchmarks -- your own or other folks', if you trust them. (And although I mentioned my own benchmarks elsewhere on this page, I'm not suggesting that they're the last word on the matter.) – Dan Breslau Dec 24 '10 at 04:26
    If you don't really know JavaScript, why comment? Your intuition alone is no guarantee of correctness, and is contradicted by both the spec and browser tests. – Tim Down Dec 24 '10 at 09:23