I have a problem which says:

Write a program which takes two digits, X and Y, as input and generates a two-dimensional array. The element value in the i-th row and j-th column of the array should be i*j. Note: i = 0, 1, ..., X-1; j = 0, 1, ..., Y-1.

Example: suppose the following inputs are given to the program: 3,5
Then, the output of the program should be: [[0, 0, 0, 0, 0], [0, 1, 2, 3, 4], [0, 2, 4, 6, 8]]
I wrote the following code for it:
var main = []
function array(x, y) {
  var n = x
  while (n < 0) {
    main.push([0])
    n -= 1
  }
  for (let i = 0; i < x; i += 1) {
    for (let j = 0; j = y; j += 1) {
      main[i].push(i * j)
    }
    main[i].shift()
  }
  return main
}
console.log(array(2, 2))
but I am getting an error when running it in VS Code. Please guide me.
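For reference, here is a minimal working sketch of the task from the problem statement. Two things in the code above look like the likely culprits: the `while (n < 0)` loop never runs (n starts at x, which is positive), so `main` stays empty and `main[i].push(...)` throws a TypeError on an undefined element; and the inner loop condition `j = y` is an assignment, not a comparison, so even if it were reached it would loop forever. The sketch below (function name `buildTable` is my own choice, not from the original) sidesteps both by building each row locally:

```javascript
// Builds an X-by-Y table where the cell at row i, column j equals i * j.
// Fixes relative to the code above: rows are created before being pushed to,
// and the inner loop uses the comparison j < y instead of the assignment j = y.
function buildTable(x, y) {
  const table = [];
  for (let i = 0; i < x; i += 1) {
    const row = []; // fresh row for each i, so no pre-seeding/shift is needed
    for (let j = 0; j < y; j += 1) {
      row.push(i * j);
    }
    table.push(row);
  }
  return table;
}

console.log(buildTable(3, 5)); // logs the 3x5 table from the example above
```

Declaring `table` inside the function (rather than as a global like `main`) also means repeated calls don't keep appending to the same array.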