
So I've been stuck on this problem for quite a while now, and I figured I could get some support from this community, as it's my last resort.

Algorithm gibby(A, B, n)

Input: arrays of integers, A and B, both of length n 
Output: integer value 

lal := 0  
for i := 0 to n-1 
    for j := 0 to n-1 
        lal := lal + (A[i] * B[j]) 
    endfor 
endfor 
return lal 
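For reference, here is my attempt at a direct Python translation of the pseudocode, so it can actually be run (the function name `gibby` is just taken from the pseudocode):

```python
def gibby(A, B, n):
    """Sum A[i] * B[j] over every pair (i, j), as in the pseudocode."""
    lal = 0
    for i in range(n):          # i = 0 .. n-1
        for j in range(n):      # j = 0 .. n-1
            lal = lal + (A[i] * B[j])
    return lal

print(gibby([1, 2], [3, 4], 2))  # 1*3 + 1*4 + 2*3 + 2*4 = 21
```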

Am I right in thinking that this has a time complexity of O(N^2)? If I'm mistaken, please elaborate, as this would be greatly appreciated.

Also, how can I create another algorithm that computes exactly the same thing as the one above, but with a time complexity of O(N)?

Thank you in advance.

Pioneer

2 Answers


Your complexity analysis is correct, since you are nesting two iterations over the two arrays.

Regarding a linear-time, O(N), algorithm: you can utilise the mathematical properties of multiplication and addition. The commutative, associative and distributive properties allow us to reorder the calculations you want to perform.

For example, with n=4 and these input arrays:

A=[a][c][e][g]
B=[b][d][f][h]

Your algorithm would perform these calculations:

i = 0 and j = 0..3: ab + ad + af + ah = a(b+d+f+h)
i = 1 and j = 0..3: cb + cd + cf + ch = c(b+d+f+h)
i = 2 and j = 0..3: eb + ed + ef + eh = e(b+d+f+h)
i = 3 and j = 0..3: gb + gd + gf + gh = g(b+d+f+h)

If you then add up those factored expressions:

a(b+d+f+h) + c(b+d+f+h) + e(b+d+f+h) + g(b+d+f+h)

You get:

(b+d+f+h)(a+c+e+g)

This is simply the product of the sums of the two individual arrays. It gives the same result, but can be computed with a linear-time algorithm. Using your pseudocode syntax, the algorithm would look like this:

suma := 0
sumb := 0
for i := 0 to n-1
    suma := suma + A[i]
    sumb := sumb + B[i]
endfor
return suma*sumb
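As a quick sanity check, here is a Python sketch (my own translation, not part of the question) that compares the linear version against the original quadratic one on a small example:

```python
def gibby_quadratic(A, B, n):
    # Original O(n^2) algorithm: sum A[i] * B[j] over all pairs.
    lal = 0
    for i in range(n):
        for j in range(n):
            lal += A[i] * B[j]
    return lal

def gibby_linear(A, B, n):
    # O(n) version: sum each array once, then multiply the two sums.
    suma = 0
    sumb = 0
    for i in range(n):
        suma += A[i]
        sumb += B[i]
    return suma * sumb

A, B = [1, -2, 3, 4], [5, 6, -7, 8]
assert gibby_quadratic(A, B, 4) == gibby_linear(A, B, 4)
print(gibby_linear(A, B, 4))  # (1-2+3+4) * (5+6-7+8) = 6 * 12 = 72
```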
sery
  1. Yes, the time complexity (upper bound) of your algorithm is O(n^2). You have two nested for loops, each running n times.
  2. In the worst case, the total number of executions of the primitive operation lal := lal + (A[i] * B[j]) is n × n = n^2. This matches the worst-case time complexity discussed in point 1.
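To illustrate point 2, here is a small sketch (a hypothetical counter I added, not from the question) that counts how many times the inner statement would execute:

```python
def count_inner_ops(n):
    # Count executions of the inner statement lal := lal + (A[i] * B[j]).
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1
    return count

for n in (1, 5, 10):
    print(n, count_inner_ops(n))  # prints n and n*n
```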

P.S. You might want to read a few chapters of Introduction to Algorithms by Thomas H. Cormen. It explains the fundamentals of time-complexity. Every programmer should read this.

VHS