I was just playing around with JavaScript and got stuck on a simple program.
I declared an array in JavaScript like this:
var a = [0, 1, 2];
Then, since there is no fixed size for an array in JavaScript and we can always add more elements, I added another integer to the array:
a[3] = 3;
As expected, if I try to access a[4], I get undefined.
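To be sure I wasn't misreading it, here is the whole first case as one runnable snippet (the console.log calls are only there to show the values I see):

var a = [0, 1, 2];        // declare an array with 3 elements
a[3] = 3;                 // grow it by assigning to index 3
console.log(a.length);    // 4
console.log(a[3]);        // 3
console.log(a[4]);        // undefined, index 4 was never assigned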
Now, if I take an array
var a = [0, 1, 2];
And add another element
a[4] = 4;
I have intentionally not defined a[3], and this also gives me a[3] as undefined.
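Here is the second case again as a single snippet; the 3 in a check is just something I added to see whether index 3 exists at all:

var a = [0, 1, 2];        // declare an array with 3 elements
a[4] = 4;                 // skip index 3 and assign to index 4
console.log(a.length);    // 5
console.log(a[3]);        // undefined
console.log(3 in a);      // false, index 3 was never created
console.log(4 in a);      // true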
Here is a fiddle where this can be observed: http://jsfiddle.net/ZUrvM/
Now, if I try the same thing in Java:
int[] a = new int[4];
a[0] = 0;
a[1] = 1;
a[3] = 3;
Then I end up with
a[2] = 0;
You can see this on ideone: https://ideone.com/WKn6Rf
The reason for this in Java, I found, is that all four elements are created when the array is declared, and we can only assign values within the declared size of the array.
But in JavaScript, when I declare an array of size 3 and then add a 5th element, why doesn't it consider the 4th element to be null or 0, now that the array size has been increased beyond 4?
Why do I see this strange behavior in JavaScript, but not in other languages?