    int func(int n){
       if(n==1)
         return 0;
       else
         return sqrt(n);
    }

Here `sqrt(n)` is the square-root function from the C `math.h` library.

  1. O(1)
  2. O(lg n)
  3. O(lg lg n)
  4. O(n)

I think the running time depends entirely on `sqrt(n)`. However, I don't know how this function is actually implemented.

P.S. The general approach I know of for finding the square root of a number is Newton's method. If I am not mistaken, the time complexity using Newton's method turns out to be O(lg n). So should the answer be O(lg n)?
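
For concreteness, here is roughly the kind of Newton iteration I have in mind (`isqrt_newton` is just a name I made up for this sketch; it is not how `math.h` actually implements `sqrt()`):

    #include <stdio.h>

    /* Illustrative integer square root via Newton's method (a sketch,
     * not the actual math.h implementation). Starting from a guess
     * proportional to n, each step roughly halves the estimate until
     * it settles at floor(sqrt(n)), so the loop runs O(lg n) times. */
    unsigned isqrt_newton(unsigned n)
    {
        if (n < 2)
            return n;
        unsigned x = n / 2 + 1;        /* initial guess, always >= sqrt(n) */
        unsigned y = (x + n / x) / 2;  /* one Newton step */
        while (y < x) {                /* stop when the estimate stops shrinking */
            x = y;
            y = (x + n / x) / 2;
        }
        return x;                      /* x == floor(sqrt(n)) */
    }

    int main(void)
    {
        printf("%u %u\n", isqrt_newton(1000000u), isqrt_newton(17u)); /* 1000 4 */
        return 0;
    }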

P.P.S. Got this question in a recent test that I appeared for.

Ishit Mehta
  • 10
    It's going to depend on the implementation of `sqrt()`. – amit Apr 16 '14 at 08:24
  • Maybe [this question](http://stackoverflow.com/questions/6884359/c-practical-computational-complexity-of-cmath-sqrt) is helpful? – lethal-guitar Apr 16 '14 at 08:26
  • 5
    And what do you think is the right answer (and explain your thoughts)? Then we can help guide you towards the right answer (if you're wrong) – Mats Petersson Apr 16 '14 at 08:26
  • @lethal-guitar: That doesn't REALLY answer the question. – Mats Petersson Apr 16 '14 at 08:30
  • @lethal-guitar You took a test in what subject? If it is computer science, then it depends on your configuration, i.e. CPU/compiler/(sqrt implementation); if it is math, then you have to know what approximation is used to calculate sqrt on the CPU. – deimus Apr 16 '14 at 08:34
  • @MatsPetersson I'm sorry, I'm not sure to which of my comments you're referring? I deleted one of them. – lethal-guitar Apr 16 '14 at 08:36
  • 1
    **Voting to reopen** - the linked question deals mainly with empirical results and implementations on cpus rather than with the theoretic aspect, which is the focus of this question. – amit Apr 16 '14 at 08:58
  • You answered the question yourself - the *time complexity* in this case depends on the actual implementation of the library function sqrt(n), which, in case Newton's method is used, is O(lg n). – SChepurin Apr 16 '14 at 09:38
  • @lethal-guitar sorry I mixed your nickname with the OPs name :) – deimus Apr 16 '14 at 10:48
  • @deimus oh I see. Deleted my comment – lethal-guitar Apr 16 '14 at 12:29

3 Answers


I am going to give a somewhat more general answer, without assuming a constant-size int.

The answer is Theta(log n).

We know Newton-Raphson runs in Theta(log n) - that excludes Theta(n) (assuming sqrt() is implemented as efficiently as possible).

However, a general number n requires log_2(n) bits to encode - and you need to read all of them in order to compute sqrt() accurately. This excludes Theta(1) and Theta(log(log(n))).

From the above, we know that the complexity of the function is Theta(log(n)).

As a side note, since O(log(n)) is a subset of O(n), the latter is also a valid answer, though not a tight one. For more information about big Theta and big O and their differences, you might want to have a look at this thread.
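
To make the Theta(log n) bound concrete, here is one possible sketch - the classic binary digit-by-digit integer square root, written for a 32-bit unsigned int (`isqrt_binary` is an illustrative name, not the actual library code). Its main loop handles one bit pair of n per iteration, i.e. Theta(log n) iterations:

    /* Digit-by-digit (binary) integer square root: examines the bits of n
     * from the most significant pair downward, one pair per loop iteration,
     * so the work grows with the bit length of n - Theta(log n). */
    unsigned isqrt_binary(unsigned n)
    {
        unsigned result = 0;
        unsigned bit = 1u << 30;     /* highest power of 4 representable in 32 bits */

        while (bit > n)              /* skip down to the highest power of 4 <= n */
            bit >>= 2;

        while (bit != 0) {           /* one iteration per bit pair of n */
            if (n >= result + bit) {
                n -= result + bit;
                result = (result >> 1) + bit;
            } else {
                result >>= 1;
            }
            bit >>= 2;
        }
        return result;               /* floor(sqrt(original n)) */
    }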

amit
  • Will the downvoter please state the fault of the answer? – amit Apr 16 '14 at 08:39
  • Okay, I understand that it requires log_2(n) bits to encode the number. But wouldn't that imply that every function, even one that takes constant time in theory, takes at least O(lg n) time on a computer, because we need to encode the information? – Ishit Mehta Apr 16 '14 at 08:51
  • 2
    @IshitMehta In theory, a lot of things that are generally regarded as `O(1)` are `O(logN)` when dealing with unbounded input size. However, many time we can bound the input size (int is only 32-64 bits, for example...) My answer was stating it is a general case answer. There are some things that are still `O(1)` regardless of the input size like adding an element to a linked list, where the element does not have to be copied -only the reference to it (which is of bounded size..) – amit Apr 16 '14 at 08:54
  • Nevermind - it's @Danvil, author of a competing answer, who downvoted. – amit Apr 16 '14 at 09:23
  • @amit: I did not downvote your answer, because it is "competing". You are making it a bit easy for yourself. First, the stated question is about "O", not "Theta". No matter what `int` you choose, the runtime of `sqrt` will always be smaller than a given constant; n is *not unbounded* in the given question. Additionally, for the general case with an unbounded integer, your answer is not correct. See the link given in my answer. – Danvil Apr 16 '14 at 10:10

This depends on the implementation of sqrt and also on what kind of time complexity you are interested in.

I would say you can consider it to be "constant", so O(1), in this sense: if you put in a random int, it will on average take the same amount of time. (Reason: numbers with many digits are much more common.)
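
As a rough, purely illustrative back-of-the-envelope check of that parenthetical claim (assuming a uniformly distributed 32-bit unsigned int): the fraction of values with bit length exactly b is about 2^(b-1)/2^32, so the expected bit length is close to the maximum:

    #include <stdio.h>

    int main(void)
    {
        const double total = 4294967296.0;   /* 2^32 possible unsigned values */
        double expected_bits = 0.0;
        for (int b = 1; b <= 32; b++) {
            /* values with bit length exactly b: 2^(b-1) of them (ignoring 0) */
            double count = (double)(1ULL << (b - 1));
            expected_bits += b * count / total;
        }
        printf("expected bit length of a uniform 32-bit int: %.2f\n", expected_bits);
        /* prints about 31.00 - almost all inputs are "large" */
        return 0;
    }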

But have a look here. Another possible answer is O(M(n)), where M(n) is the complexity of a multiplication and n is the number of digits in your integer.

This looks like a textbook question and is perhaps meant to be a trap. The teacher perhaps wants to check whether you can distinguish between computing sqrt for a list of numbers (which would be O(n)) and for a single number (which would be O(1)).

Be aware that the "correct" answer often also depends on the context in which it is asked.

Danvil
  • So, will larger numbers take, on average, the same amount of time as smaller numbers? – Ishit Mehta Apr 16 '14 at 08:34
  • @IshitMehta: No. In general large numbers will take longer than short ones. – Danvil Apr 16 '14 at 08:35
  • 1
    Then how is the time complexity O(1)? – Ishit Mehta Apr 16 '14 at 08:36
  • @IshitMehta: As I said, it depends on what "n" is. n=number of items => O(1) as there is only one number. n=number of digits => O(M(n)). Please read my whole answer. – Danvil Apr 16 '14 at 08:40
  • I think `n` is perfectly defined as the number, as it is the name of the argument itself. – amit Apr 16 '14 at 08:43
  • Apologies. English isn't my first language. Didn't mean to sound aggressive. – Ishit Mehta Apr 16 '14 at 08:43
  • 1
    @amit: So for n=654, the runtime would be O(lg 654)? ;) In my eyes it is ambiguous and possibly a trap. – Danvil Apr 16 '14 at 08:45
  • @Danvil Recall that big-O has very little meaning for small values of `n`; it deals with asymptotic bounds. However, generally - yes, it will behave 'similarly' to log(654) for n=654, and to log(654e10) for n=654e10. – amit Apr 16 '14 at 08:47
  • @amit: And now look at the question. It is about the time complexity of `sqrt(int)`. A (32-bit) int has at most 10 digits! Again: the answer depends on the context of the question. – Danvil Apr 16 '14 at 08:52
  • 2
    so in most implementations of most algorithms the complexity is O(1), because the inputs is bounded by 32 or 64 bits? I dont think it works this way. Sure if your problem has natural limitations this is true. – AbcAeffchen Apr 16 '14 at 11:30

Let n = 2^m, so m = log2(n).
Given T(n) = T(sqrt(n)) + 1,
T(2^m) = T(2^(m/2)) + 1.
Let T(2^m) = S(m); then
S(m) = S(m/2) + 1.
Using the master theorem,
S(m) = theta(log m)
     = theta(log(log(n))).
Hence, under this recurrence the time complexity is theta(log(log(n))).
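
A quick, purely illustrative check of that growth rate (`sqrt_depth` is just a helper for this sketch; it only counts repeated calls to the library sqrt() and says nothing about how sqrt() itself is implemented):

    #include <math.h>
    #include <stdio.h>

    /* Counts how many times sqrt can be applied before the value drops
     * below 2, i.e. the depth of the recurrence T(n) = T(sqrt(n)) + 1. */
    int sqrt_depth(double n)
    {
        int count = 0;
        while (n >= 2.0) {
            n = sqrt(n);
            count++;
        }
        return count;
    }

    int main(void)
    {
        /* 2^4, 2^8, 2^16: doubling the exponent adds one level - log log n growth */
        printf("%d %d %d\n", sqrt_depth(16.0), sqrt_depth(256.0), sqrt_depth(65536.0));
        /* prints: 3 4 5 */
        return 0;
    }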

Himanshu