This function should initialize arrays C[m+1][n+1] and B[m][n] and fill the first row and first column of C with zeros. Note: int*** C is a pointer to the 2D integer array, so the function can allocate the table for the caller. Please correct the error.
#include <iostream>
using namespace std;

// Allocates C as an (m+1) x (n+1) table and B as an m x n table.
// The trailing () in new int[n + 1]() value-initializes every entry
// to 0, which covers the required zero first row and column of C.
void initLCSTable(int*** C, char*** B, int m, int n)
{
    *C = new int*[m + 1];
    for (int i = 0; i <= m; ++i)
        (*C)[i] = new int[n + 1]();
    *B = new char*[m];
    for (int i = 0; i < m; ++i)
        (*B)[i] = new char[n]();
}
void printLengthTable(int** C, int m, int n);
void printArrowTable(char** B, int m, int n);
int main()
{
    int m = 7, n = 6;   // example sizes; use the lengths of your two sequences
    int** C = nullptr;
    char** B = nullptr;
    initLCSTable(&C, &B, m, n);

    cout << "\nTable C" << endl;
    printLengthTable(C, m, n);
    cout << "\nTable B" << endl;
    printArrowTable(B, m, n);
    return 0;
}
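The two print helpers are only declared above, so the bodies below are a guess: a minimal sketch consistent with the prototypes, assuming C is (m+1) x (n+1) and B is m x n as described, with the exact formatting an assumption based on the linked screenshot.

// Hypothetical bodies for the declared helpers (layout is an assumption).
void printLengthTable(int** C, int m, int n)
{
    for (int i = 0; i <= m; ++i) {
        for (int j = 0; j <= n; ++j)
            cout << C[i][j] << ' ';
        cout << endl;
    }
}

void printArrowTable(char** B, int m, int n)
{
    for (int i = 0; i < m; ++i) {
        for (int j = 0; j < n; ++j)
            cout << (B[i][j] ? B[i][j] : '.') << ' ';  // '.' marks an empty cell
        cout << endl;
    }
}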
Output should look like this (ignore the non-zero entries, since filling those in is a different Longest Common Subsequence question altogether): https://i.stack.imgur.com/ElWMY.png
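As an aside, if the assignment does not require raw pointers, the same zero-filled tables fall out of std::vector with no manual new; this is only a sketch of an alternative, not a fix to the code above:

#include <vector>

// Alternative sketch: std::vector value-initializes its elements, so every
// entry of C (including row 0 and column 0) starts at zero automatically.
void initLCSTable(std::vector<std::vector<int>>& C,
                  std::vector<std::vector<char>>& B, int m, int n)
{
    C.assign(m + 1, std::vector<int>(n + 1, 0));
    B.assign(m, std::vector<char>(n, ' '));
}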