0
  var inc = .001;
  var z = new Array(1.0/inc);
  for (var x = 0.0; x < 1.0; x += inc) {
    z.push(Math.cos(x));
  }
  var y = new Array(1.0/inc);
  for (x = 0.0; x < 1.0; x += inc) {
    y.push(1 - ((x * x) / 2) + ((x * x * x * x) / 24));
  }
  var sum = 0;
  for (var i = 0; i < (1.0/inc); i++) {
    sum += y[x] - z[x];
  }
  console.log(sum);
  console.log(sum/(1.0/inc));

I'm pretty new to JavaScript. The arrays here are filled with floats, but when I take the difference and try to print it, I get NaN. I'm stumped. Here's a fiddle with the code (http://jsfiddle.net/2v7wu/). Thanks!

  • explain what you expect `var z = new Array(1.0/inc);` to generate... it's creating an array with 1000 elements that are all undefined. Try `console.log(z)` right after you declare it. I doubt it's what you wanted to do – charlietfl Nov 23 '13 at 15:24
  • @charlietfl: Enh, 1000 entries. – T.J. Crowder Nov 23 '13 at 15:25

3 Answers

3

You're creating arrays that consist of 1000 empty values, and then pushing extra elements onto those. Your arrays end up 2000 elements long, of which you iterate over the first (empty) 1000.
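
You can see this in miniature in the console (a small sketch, not from the question itself):

  var a = new Array(3);   // length 3, but no entries stored yet
  a.push('x');            // push appends at index 3 (the current length)
  console.log(a.length);  // 4
  console.log(a[0]);      // undefined: nothing was ever stored at 0..2

The same thing happens at scale: new Array(1000) followed by 1000 pushes gives a length of 2000, with nothing at indices 0 through 999.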

You don't need to declare the length of arrays in Javascript, so just using

var z = [];
var y = [];

will be fine.

Finally, you need to change your array index in the last loop to

sum += y[i] - z[i];
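
Putting both fixes together, a corrected version might look like this (a sketch; filling both arrays in a single loop also guarantees they end up the same length):

  var inc = 0.001;
  var z = [];
  var y = [];
  for (var x = 0.0; x < 1.0; x += inc) {
    z.push(Math.cos(x));
    y.push(1 - ((x * x) / 2) + ((x * x * x * x) / 24)); // Taylor polynomial for cos(x)
  }
  var sum = 0;
  for (var i = 0; i < z.length; i++) { // use the actual length rather than 1.0/inc
    sum += y[i] - z[i];
  }
  console.log(sum);
  console.log(sum / z.length);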
Gareth
  • I think there is also an issue here with how JavaScript does floating point numbers. I'm betting your sums won't be what you want – dbarnes Nov 23 '13 at 15:23
  • Maybe, but they won't be `NaN` at least – Gareth Nov 23 '13 at 15:25
  • @Gareth: I don't know what I was thinking, of course `a.push(value)` is basically `a[a.length] = value`. But it's important to understand that it's not that the other entries are "empty", it's that *they aren't there at all* (because these aren't really arrays). `var a = new Array(3); a.push("x");` gives you an array with **one** entry and a `length` of `4`. – T.J. Crowder Nov 23 '13 at 15:31
  • @T.J.Crowder absolutely, and if OP were iterating using `forEach` then there wouldn't be a problem (other than having a messy array in the first place) – Gareth Nov 23 '13 at 15:34
  • I was actually looking for a way to iterate over two arrays with forEach in Javascript. Is there a way? (See the sketch after these comments.) Also, @dbarnes, what would the floating point error be? – user3025173 Nov 23 '13 at 15:42
  • @user3025173 http://stackoverflow.com/questions/588004/is-javascripts-floating-point-math-broken and the workaround is simple: multiply all the decimals by some factor (e.g. 1000), do your work, then divide by the same factor (e.g. 1000) – dbarnes Nov 23 '13 at 16:05
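
On the forEach question above: the callback you pass to forEach receives the element's index as its second argument, so you can use that index to read from the second array. A sketch, assuming y and z are equal-length arrays built as in the answer:

  var sum = 0;
  y.forEach(function (value, i) {
    sum += value - z[i]; // i is the current index into y
  });

As a bonus, forEach skips holes entirely, which is why Gareth notes above that it would have side-stepped the original problem.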
2

Because you're using x where you mean to be using i:

for (var i = 0; i < (1.0/inc); i++) {
  sum += y[x] - z[x];
  //       ^------^--------- these should be i, not x
}

See also Gareth's answer, which relates to my side note #2 below. By starting out with an initial length and then using push, you're not putting things where you think you are. :-)


Two side notes:

  1. You're constantly re-evaluating 1.0/inc. I recommend doing that once and storing it in a variable (see the sketch after these notes).

  2. In JavaScript, there's rarely any reason to write new Array(length). Just use var z = []; and var y = []; instead. These arrays aren't really arrays at all, and they'll "grow" as needed (without lots of memory reallocations, precisely because they're not really arrays).
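
Both notes together, as a sketch:

  var inc = 0.001;
  var count = 1.0 / inc;         // side note 1: evaluate this once
  var z = [];                    // side note 2: no need to pre-size the array
  for (var i = 0; i < count; i++) {
    z.push(Math.cos(i * inc));   // deriving x from i also avoids accumulating float drift
  }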

T.J. Crowder
0

In your third loop, you're trying to use the floating point value x as an index into y and z. This produces the result undefined, and undefined - undefined evaluates to NaN.

Presumably, you meant to use i as the index.
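
You can confirm both halves of that in the console (a sketch; after the second loop, x has accumulated to a value slightly above 1.0, and indexing with a non-integer number just looks up a property that doesn't exist):

  var y = [1, 2, 3];
  console.log(y[1.0000000000000002]); // undefined: no property with that name
  console.log(undefined - undefined); // NaN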

Jose Torres