
I am trying to figure out the time complexity of this simple program, but I can't work out the best way to do it.

I have written down the time complexity side by side for each line:

public int fnA (int n) {
  int sum = 0;                    O(1)
  for (int i = 0; i < n; i++) {   O(n)
    int j = i;                    O(n)
    int product = 1;              O(1)

    while (j > 1) {               O(n)
      product *= j;               O(log n)
      j = j / 2;                  O(log n)
    }
    sum += product;               O(1)
  }
  return sum;                     O(1)
}

Am I correct in assuming these running times, and that the final running time is O(n)?

If not, could somebody explain where I am going wrong?

Overall:

1 + n + n + 1 + n + log n + log n + 1 + 1
= 3n + 2 log n + 4

Final: O(n)
RandomMath

4 Answers


The time complexity of that algorithm is O(N log N).

The for loop executes N times (i runs from 0 to N-1).

The while loop executes log N times, since you are halving j on each iteration.

Since the while loop is inside the for loop, you are executing a log N operation N times; that is where the O(N log N) comes from.

All remaining operations (assignment, multiplication, division, addition) can be assumed to take O(1) time.
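A quick empirical check of this (my addition, not part of the answer; the class and variable names are mine): count how often the inner loop body actually runs for the OP's function and compare it with N * log2(N). The two grow at the same rate.

public class InnerLoopCount {
    public static void main(String[] args) {
        int n = 1_000_000;
        long count = 0;
        for (int i = 0; i < n; i++) {
            int j = i;
            while (j > 1) {   // the same halving loop as in fnA
                j = j / 2;
                count++;
            }
        }
        // n * log2(n), for comparison
        double bound = n * (Math.log(n) / Math.log(2));
        System.out.println("inner-loop iterations: " + count);
        System.out.println("n * log2(n):           " + (long) bound);
    }
}

The printed count comes out somewhat below n * log2(n), because small values of i halve fewer times, but both numbers grow as N log N.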

Arturo Menchaca

The crux of the above program is the while loop: it is the defining factor. The rest of the lines have complexity no greater than O(n), assuming that arithmetic operations run in O(1) time.

while (j > 1) {
  product *= j;
  j = j / 2;
}

The above loop has a running time of O(log(j)), and j varies from 1 to n, so the total is the series...

-> O(log(1) + log(2) + log(3) + log(4) + ... + log(n))
-> O(log(1*2*3*4*...*n))
-> O(log(n!))

and O(log(n!)) is equal to O(n log(n)).
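A short sketch of why (my addition; the linked question below has the full proof): bound the sum from both sides,

\log(n!) = \sum_{j=1}^{n} \log j \;\le\; n \log n,
\qquad
\log(n!) \;\ge\; \sum_{j=\lceil n/2 \rceil}^{n} \log j \;\ge\; \frac{n}{2} \log \frac{n}{2},

so \log(n!) = \Theta(n \log n).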

For a proof of the above, refer to this.

Binary Baba

No. For every i there is a log n loop running, and hence over n iterations the total complexity is n log n.

You know that the following loop takes log n:

while (j > 1) {
  product *= j;
  j = j / 2;
}

Now, this particular loop is executed for every i, so it runs n times in total. That gives n log n.
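To make the "takes log n" part concrete, here is a minimal sketch (the helper name is mine, not from the answer): the halving loop runs floor(log2(j)) times for any j >= 2.

static int countHalvings(int j) {
    int steps = 0;
    while (j > 1) {   // same loop as above, body reduced to counting
        j = j / 2;
        steps++;
    }
    return steps;
}

// countHalvings(8)  -> 3   (8 -> 4 -> 2 -> 1)
// countHalvings(15) -> 3   (floor(log2(15)) = 3)
// countHalvings(16) -> 4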

Priyansh Goel

To start with, you could count all operations. For example:

public int fnA (int n) {
  int sum = 0;                    1
  for (int i = 0; i < n; i++) {
    int j = i;                    n
    int product = 1;              n

    while (j > 1) {
      product *= j;               ?
      j = j / 2;                  ?
    }
    sum += product;               n
  }
  return sum;                     1
}

Now we can do the counting, which gives an equation that sums up to: 2 + 3n + n log(n).

In a lot of programs the counting is more complex and usually has one outstanding higher-order term, for example: 2 + 3n + 2n². When talking about performance we really care about large n, because when n is small the sum is small anyway. When n is large the higher-order term dwarfs the rest, so in this example 2n² is really the term that matters. That is the concept of tilde approximation.

With that in mind, one can usually quickly identify the portion of code that gets executed most often and use its count to represent the overall time complexity. In the example given by the OP, it would look like this:

for (int i = 0; i < n; i++) {
    int product = 1;
    for (int j = i; j > 1; j /= 2)
        product *= j;
}

which gives ∑ log2(i) for i from 1 to n. Usually the counting involves discrete mathematics; one trick I have learned is to replace the sum with an integral and do the calculus: ∫ log2(x) dx ≈ n log(n).
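Worked out explicitly (my addition), that integral evaluates to:

\int_{1}^{n} \log_2 x \, dx
= \frac{1}{\ln 2} \int_{1}^{n} \ln x \, dx
= \frac{n \ln n - n + 1}{\ln 2}
\;\sim\; n \log_2 n,

which matches the ∑ log2(i) count up to lower-order terms.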

Tianwei Chen