What is the run-time analysis of this function in terms of Big O?
#include <algorithm>   // for std::swap
using std::swap;

// Euclid's algorithm; assumes m != 0.
int gcd(int n, int m) {
    if (n % m == 0) return m;   // m divides n, so m is the gcd
    if (n < m) swap(n, m);      // make sure n >= m
    while (m > 0) {
        n = n % m;              // replace n with the remainder
        swap(n, m);             // keep the larger value in n
    }
    return n;
}
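To show concretely what I'm asking about, here is an instrumented version (the steps counter and the test inputs are my own additions, purely for illustration) that counts how many times the loop body runs:

#include <algorithm>
#include <cstdio>

// Same algorithm as above, with a counter for the loop iterations.
int gcd_counted(int n, int m, int &steps) {
    if (n % m == 0) return m;
    if (n < m) std::swap(n, m);
    while (m > 0) {
        n = n % m;
        std::swap(n, m);
        ++steps;
    }
    return n;
}

int main() {
    // Inputs of growing size; the step counts grow very slowly.
    int pairs[][2] = {{10, 4}, {1000, 617}, {1000000, 617523}};
    for (auto &p : pairs) {
        int steps = 0;
        int g = gcd_counted(p[0], p[1], steps);
        std::printf("gcd(%d, %d) = %d in %d steps\n", p[0], p[1], g, steps);
    }
    return 0;
}

The step counts stay tiny even as the inputs grow by orders of magnitude, which matches the claimed O(log N), but I still don't see how to derive that bound.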
The answer given is O(log N), but I'm not able to figure out why. Can anyone please explain? Thanks in advance.
PS: I know lots of people are saying this is a duplicate and pointing to solutions posted earlier. I had already found those solutions before posting here; what I'm asking for is a SIMPLE explanation of them.