I need to find the time and space complexity of this Java code for recursively computing the determinant of a matrix:

    public int determinant(int[][] m) {
        int n = m.length;
        if (n == 1) {
            return m[0][0];
        } else {
            int det = 0;
            for (int j = 0; j < n; j++) {
                det += Math.pow(-1, j) * m[0][j] * determinant(minor(m, 0, j));
            }
            return det;
        }
    }

    public int[][] minor(final int[][] m, final int i, final int j) {
        int n = m.length;
        int[][] minor = new int[n - 1][n - 1];
        int r = 0, s = 0;
        for (int k = 0; k < n; k++) {
            int[] row = m[k];
            if (k != i) {
                for (int l = 0; l < row.length; l++) {
                    if (l != j) {
                        minor[r][s++] = row[l];
                    }
                }
                r++;
                s = 0;
            }
        }
        return minor;
    }

Help: Determine the number of operations and memory consumption of the algorithm with respect to n, and after dividing by n^2 you will get the desired result.

I'm confused. I calculated the time complexity as the sum of the input size (n^2) and the number of steps, and the space complexity as just the input size. (?) So it's O(n^2), but I don't think I'm doing it right. And why should I divide it? (The hint is from the teacher.)

Can someone explain to me how to calculate this in this case?

ZlyVlk
  • *Math.pow(a, n)* takes O(log n) time complexity. If recursion takes place *n* times then it will take *O(n)* space complexity. I am leaving the rest to you. – Kangkan Lahkar Jan 04 '23 at 13:15
  • @KangkanLahkar Wrong, Math.pow takes O(1), see [this answer](https://stackoverflow.com/a/32419139/1488799). At any rate, Math.pow is not the most interesting piece here. – Gassa Jan 04 '23 at 16:47
  • @ZlyVlk A nice approach would be to express the time complexity T(n) in terms of T(n-1), where n is the size of the current matrix m. Then perhaps simplify the expression by repeatedly removing brackets. As for division, I'm puzzled... might be a typo and mean multiplication instead?.. – Gassa Jan 04 '23 at 16:52
  • @Gassa In the assignment, it says "divide", but that also seems strange to me. I wrote an email to the teacher but he hasn't replied yet. I don't know if I understand your method correctly, but I tried this: I think that j-cycle will run n-times, k-cycle (n-1)-times (because k!=i) and l-cycle (n-1)-times (l!=j), so it n(n-1)(n-1)=n^3 -2n^2 +n and it's O(n^3). Do you think that is correct? – ZlyVlk Jan 06 '23 at 12:53
  • Up to `n^3`, right. You should not care about the lower terms though, `O(n^10)` is the same as `O(n^10 + 100 n^9)`. – Gassa Jan 06 '23 at 13:26
  • But then, what's the reason to say "the _whole thing_ is `O(n^3)`" and stop? – Gassa Jan 06 '23 at 13:26

1 Answer


Let us see. For an input matrix of size n, denote the time complexity as T(n).

So, for a matrix of size n:

  • if n = 1, we have the answer right away: T(1) = O(1);

  • otherwise, we loop over j for a total of n times, and for each j, we:

    • construct a minor in O(n^2) (the minor function), and then

    • run the function recursively for that minor: this one takes T(n-1) time.

Putting it all together, we have T(1) = O(1) and T(n) = n * (n^2 + T(n-1)).
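As a quick sanity check (a sketch of my own, not needed for the argument itself): the recurrence can be evaluated exactly with BigInteger and compared against n!; the ratio T(n) / n! stays bounded by a constant. The class and method names below are purely illustrative.

```java
import java.math.BigInteger;

public class TimeRecurrence {
    // T(1) = 1; T(n) = n * (n^2 + T(n-1)), evaluated exactly.
    static BigInteger t(int n) {
        if (n == 1) return BigInteger.ONE;
        BigInteger bn = BigInteger.valueOf(n);
        return bn.multiply(bn.multiply(bn).add(t(n - 1)));
    }

    static BigInteger factorial(int n) {
        BigInteger f = BigInteger.ONE;
        for (int k = 2; k <= n; k++) f = f.multiply(BigInteger.valueOf(k));
        return f;
    }

    public static void main(String[] args) {
        // The integer part of T(n) / n! settles quickly, illustrating T(n) = O(n!).
        for (int n = 2; n <= 15; n++) {
            System.out.println(n + ": " + t(n).divide(factorial(n)));
        }
    }
}
```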

To understand what is going on, write down what is T(n-1) there:

T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2)))

And then, do the same for T(n-2):

T(n) = n * (n^2 + T(n-1)) =
= n * (n^2 + (n-1) * ((n-1)^2 + T(n-2))) =
= n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) = ...

Then we write what is T(n-3), and so on. Now, open the brackets:

T(n) = n * (n^2 + (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3)))) =
= n^3 + n * (n-1) * ((n-1)^2 + (n-2) * ((n-2)^2 + T(n-3))) =
= n^3 + n * (n-1)^3 + n * (n-1) * (n-2) * ((n-2)^2 + T(n-3)) =
= n^3 + n * (n-1)^3 + n * (n-1) * (n-2)^3 + n * (n-1) * (n-2) * T(n-3) = ...

Continuing further, the largest of these terms, in terms of n, is

n * (n-1) * (n-2) * ... * 3 * 2 * 1^3, which is just n! (n-factorial),

and the remaining terms only contribute a constant factor on top of it, so T(n) = O(n!).
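To spell out why the whole sum is still only O(n!) (my own elaboration): each term in the expansion has the form (n!/k!) * k^3, and the factors k^3/k! sum to a constant:

```latex
T(n) \;=\; n! + \sum_{k=2}^{n} \frac{n!}{k!}\,k^3
     \;=\; n!\left(1 + \sum_{k=2}^{n} \frac{k^3}{k!}\right)
     \;\le\; n!\left(1 + \sum_{k=2}^{\infty} \frac{k^3}{k!}\right)
     \;=\; O(n!),
```

since the series \(\sum_{k\ge 1} k^3/k!\) converges (its value is \(5e\)).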


The above is about time complexity. To address space complexity, consider how recursion uses memory; note that it behaves quite differently from time. The reason is that, at any point in time, execution is inside one particular branch of the recursion tree, and at that moment the other branches are not holding on to their memory.

Therefore, unlike with time, we can't just add up the total memory used by all recursive calls. Instead, we have to consider all branches of the recursion tree, and find the one that uses maximum memory. In this particular case, it is just any deepest branch of the recursion.

Denote the memory consumption by M(n) where n is the matrix size.

  • if n = 1, we have the answer right away: M(1) = O(1);

  • otherwise, we loop over j for a total of n times, and for each j, we:

    • construct a minor that takes O(n^2) memory (the minor function), and then

    • run the function recursively for that minor: this one takes M(n-1) memory.

Note that the loop does not accumulate the memory used by the minors: by the time we construct the minor for the next j, the minor for the previous j is no longer needed. Thus we have M(n) = n^2 + M(n-1).

Again, writing down what is M(n-1), then M(n-2), and so on gets us to

M(n) = n^2 + (n-1)^2 + (n-2)^2 + ... + 3^2 + 2^2 + 1^2, which is O(n^3) with some constant factor we need not care about.
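The same kind of sanity check works for the memory recurrence (again just a sketch of my own; the names are illustrative): unrolling M(n) = n^2 + M(n-1) gives the sum of squares, whose closed form n(n+1)(2n+1)/6 is visibly Theta(n^3).

```java
public class SpaceRecurrence {
    // M(1) = 1; M(n) = n^2 + M(n-1): peak memory along the deepest recursion branch.
    static long m(int n) {
        return n == 1 ? 1 : (long) n * n + m(n - 1);
    }

    // Closed form of 1^2 + 2^2 + ... + n^2, which is Theta(n^3).
    static long sumOfSquares(int n) {
        return (long) n * (n + 1) * (2L * n + 1) / 6;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 20; n++) {
            System.out.println(n + ": " + m(n) + " == " + sumOfSquares(n));
        }
    }
}
```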


So, by the above reasoning, the answer is: T(n) = O(n!) and M(n) = O(n^3).
As for the hint about dividing by n^2, I don't have a clue, sorry!

Gassa
  • Oh, I see. Thank you very much! I'll see, maybe the teacher will explain the division to me. Thanks again. – ZlyVlk Jan 06 '23 at 15:01