I have a question about Euclid's algorithm for finding greatest common divisors: gcd(p, q), where p > q and q is an n-bit integer. I'm trying to follow a time complexity analysis of the algorithm (the input is n bits, as above):
    gcd(p, q)
        if (p == q)
            return q
        if (p < q)
            return gcd(q, p)    // swap the arguments so that p >= q
        while (q != 0)
            temp = p % q        // one step: (p, q) -> (q, p mod q)
            p = q
            q = temp
        return p
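For concreteness, here is a runnable Python transcription of the pseudocode (my own sketch; the name gcd_steps and the counter m are additions so the number of loop iterations can be observed):

    def gcd_steps(p, q):
        # Euclid's algorithm from above; also returns the iteration count m.
        if p < q:
            p, q = q, p              # swap so that p >= q
        m = 0                        # counts while-loop iterations (my addition)
        while q != 0:
            p, q = q, p % q          # one Euclid step: (p, q) -> (q, p mod q)
            m += 1
        return p, m

    # Consecutive Fibonacci numbers are the classic worst case:
    print(gcd_steps(832040, 514229))   # -> (1, 28)

Running it on consecutive Fibonacci numbers shows m growing linearly in the bit length of q, which matches the O(n) claim below.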
I already understand that the sum of the two numbers, u + v, where u and v stand for the initial values of p and q, reduces by a factor of at least 1/2 at each iteration.
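(For instance, a single step sends (p, q) = (100, 7) to (7, 2), so the sum drops from 107 to 9.)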
Now let m be the number of iterations of the algorithm. We want to find the smallest integer m such that (1/2)^m (u + v) <= 1.
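If it helps, here is the algebra as I understand it, assuming u + v < 2^(n+1) (which holds after the first iteration, since both values are then at most q < 2^n):

$$\left(\frac{1}{2}\right)^m (u+v) \le 1 \iff u+v \le 2^m \iff m \ge \log_2(u+v),$$

so the smallest such m is at most log_2(2^(n+1)) = n + 1, i.e. m = O(n).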
Here is my question: I get that the sum of the two numbers after m iterations is upper-bounded by (1/2)^m (u + v). But I don't really see why the maximum m is reached when this quantity is <= 1.
The answer is O(n) for an n-bit q, but this is where I'm getting stuck.
Please help!!