
Why is `matrix[i+1]` not defined, while it works properly for `matrix[i]`?

    function matrixElementsSum(matrix) {
        let x = 0;
        let i = 0;
        let j = 0;
        for (i = 0; i < matrix.length; i++) {
            for (j = 0; j < matrix[i].length; j++) {
                if (matrix[i][j] === 0) {
                    matrix[i + 1][j] = 0;
                    // Cannot set property '0' of undefined
                }
                x = matrix[i].reduce(function (a, b) { return a + b; }, 0);
                x += x;
            }
        }
        return x;
    }
Victor M Perez
Kahnem
  • Think about the case when `i` is `matrix.length-1`; at that point `matrix[i+1]` does not exist. – void Feb 14 '18 at 11:15
  • In the very last iteration of the outer loop, when `i == matrix.length - 1`, then `i + 1` will be out of bounds. – Some programmer dude Feb 14 '18 at 11:15
  • Possible duplicate of [Why is using "for...in" with array iteration a bad idea?](https://stackoverflow.com/questions/500504/why-is-using-for-in-with-array-iteration-a-bad-idea) – Ankit Patidar Feb 14 '18 at 11:25

2 Answers


The index `i` only goes up to `matrix.length - 1` in your

    for (i = 0; i < matrix.length; i++) {
        // etc.

so on the last iteration `matrix[i + 1]` is `matrix[matrix.length]`, which is undefined.
Victor M Perez

Why is `matrix[i+1]` not defined, while it works properly for `matrix[i]`?

Simply because of your for-loop condition:

    for (i = 0; i < matrix.length; i++) {

The value of `i` can go up to `matrix.length - 1`, and since array indexing starts at 0, `matrix[matrix.length]` is always undefined. So on the last iteration, `matrix[i + 1][j] = 0` tries to set a property of `undefined`, which throws the error you see.
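A minimal sketch of a guarded version, assuming the intent is to zero out the cell directly below each zero (as in the usual matrixElementsSum exercise); it also moves the `reduce` out of the inner loop and replaces the `x = …; x += x;` pair with a plain accumulation, since the original doubles and overwrites the sum:

    // Only write to matrix[i + 1] when that row actually exists.
    function matrixElementsSum(matrix) {
        let x = 0;
        for (let i = 0; i < matrix.length; i++) {
            for (let j = 0; j < matrix[i].length; j++) {
                if (matrix[i][j] === 0 && i + 1 < matrix.length) {
                    matrix[i + 1][j] = 0; // safe: row i + 1 exists
                }
            }
            // Add this row's values once, after its zeros have propagated down
            x += matrix[i].reduce(function (a, b) { return a + b; }, 0);
        }
        return x;
    }

Because zeros propagate one row per pass and the loop runs top-down, a zero high in a column eventually blanks everything below it.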

gurvinder372