3

For Example:

var arr = [];
arr[3.4] = 1;
console.log(arr.length);

In the above code sample, why does the length property hold zero? What happens inside the JS engine that leaves the length at zero?

gvgvgvijayan
  • Please look into this [Length of a JavaScript object](http://stackoverflow.com/questions/5223/length-of-a-javascript-object-that-is-associative-array) – NikhilGoud Jan 08 '16 at 13:47

4 Answers

10

An array's length is reflective of the largest array index present in the array. An "array index" (see below) is a property name that is an integer value less than 2³² − 1. Since 3.4 is not an integer, setting it does not alter the length.

Arrays are objects, so there is no reason why an array can't have a property named 3.4, but that property doesn't influence the length because its name does not fit the criteria of an array index.
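You can see the distinction directly; the property is reachable under the string key "3.4" even though the length stays at zero:

```javascript
// A fractional key becomes an ordinary string-named property
// on the array object, not an array index.
var arr = [];
arr[3.4] = 1;

console.log(arr.length);        // 0 -- no array index was created
console.log(arr["3.4"]);        // 1 -- the property exists under the key "3.4"
console.log(Object.keys(arr));  // ["3.4"]
```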

ES2015 9.4.2, Array Exotic Objects defines an "array index" as an integer less than 2³² − 1:

A property name P (in the form of a String value) is an array index if and only if ToString(ToUint32(P)) is equal to P and ToUint32(P) is not equal to 2³² − 1.

And that definition is used in relation to the length value (emphasis mine):

Every Array object has a length property whose value is always a nonnegative integer less than 2³². The value of the length property is numerically greater than the name of every own property whose name is an array index; whenever an own property of an Array object is created or changed, other properties are adjusted as necessary to maintain this invariant. Specifically, whenever an own property is added whose name is an array index, the value of the length property is changed, if necessary, to be one more than the numeric value of that array index...

apsillers
  • `length` reflects the largest array index, not the largest integer index. For example, `Object.assign([], {[1e53]:1}).length === 0`, but `Object.assign([], {[99]:1}).length === 100` – Oriol Jan 01 '17 at 04:55
  • @Oriol Yes, I tried to make that clear in the second part of the answer, but I agree that the first sentence was inarguably incorrect. I wanted to communicate "if a key isn't an integer then it's ineligible to fit the definition of an array index" but I fell short of that. I hope my edit is better. – apsillers Jan 01 '17 at 06:00
2

A JavaScript array cannot have fractional indexes.

What you have done there is assign a property called "3.4" on the array object.

This does not affect the length property, which is designed to return one number higher than the highest valid index.

If you think about how arrays are supposed to work, you should realise a fractional offset doesn't make sense.
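For contrast, here is how a valid integer index does drive the length (output comments show what Node.js prints):

```javascript
var arr = [];
arr[9] = "x";             // a real array index
console.log(arr.length);  // 10 -- one more than the highest valid index

arr[2.5] = "y";           // fractional key: just a named property
console.log(arr.length);  // still 10
```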

alex
1

You could try using a plain object instead of an array.

var arr = {};
arr[3.4] = 1;
arr[3.5] = 2;
arr[3.6] = 3;
console.log(Object.keys(arr).length); // 3
TimD
0

You must use an integer as an index, not a float.

var arr = [];
arr[3] = 1;
console.log(arr.length); // 4

Pascal Le Merrer