
I have written the code below to sort a list in ascending order, using a function defined within a function. Now I want to calculate the time complexity of this function. On my own I have worked out that `unite` is called each time `sort` completes its recursive calls, so two functions are used every time, and from that I concluded that the complexity of this function is O(nlog(n)). I am new to this chapter, so I want to know how to calculate this type of complexity; the figure above is just my approximation, and I neither know the real answer nor have any solution or hints, so please explain your reasoning in your answer. Thanks. Here is my code.

def sort(lst):
    def unite(l1, l2):
        if len(l1) == 0:
            return l2
        elif len(l2) == 0:
            return l1
        elif l1[0] < l2[0]:
            return [l1[0]] + unite(l1[1:], l2)
        else:
            return [l2[0]] + unite(l1, l2[1:])

    if len(lst) == 0 or len(lst) == 1:
        return lst
    else:
        front = sort(lst[:len(lst) // 2])
        back = sort(lst[len(lst) // 2:])

        L = lst[:]  # the next 3 questions below refer to this line
        return unite(front, back)
  • Whence cometh `sort4`? – Hyperboreus Dec 12 '13 at 19:44
  • Actually, it's at least O(n^2) because `unite` makes O(n^2) copies for inputs of size n1 + n2 = n. (And O(n log n) is not an approximation of that in any sense of the word.) –  Dec 12 '13 at 19:45
  • Oh, but I have seen a couple of examples that have function calls within the function, and they have a complexity of O(log(n)). – user3096874 Dec 12 '13 at 19:48
  • Just because you have function calls in a function doesn't mean that they're the same complexity as your examples. Runtime is based on what the functions do, not where they're constructed. There is a way to write your `unite` function that will allow it to run in `O(n)`, but you didn't do that. – AAA Dec 12 '13 at 19:51
  • You might benefit from reading some of the other questions (especially the high-voted ones) with the tags `time-complexity` and/or `big-o`. –  Dec 12 '13 at 19:56
  • My favourite source on solving the general problem is the highest-voted answer to [Big O, how do you calculate/approximate it?](http://stackoverflow.com/questions/3255/big-o-how-do-you-calculate-approximate-it). – Simon Dec 12 '13 at 20:00

3 Answers


The first step is to note that the real work is being done in the `unite` step of your code, which does n^2 work on an input of size n because it creates new lists on every recursive call.

So you can actually write a quick recurrence for the amount of work your function is doing:

W(n) = 2W(n/2) + n^2

because you're recursing twice on lists that are of length n/2 and doing n^2 work to rejoin them.
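
If you want to see that quadratic rejoining cost concretely, here is a rough instrumented sketch (my own, not part of the question's code) that counts the elements copied by the tail slices and the `[x] + ...` concatenations inside a single merge; the count roughly quadruples each time the input size doubles:

import random

def count_copies(l1, l2):
    # Mirrors the structure of the posted unite, but instead of building the
    # merged list it counts the elements copied by the tail slice (l1[1:] or
    # l2[1:]) and by the [x] + ... concatenation at each recursive call.
    if len(l1) == 0 or len(l2) == 0:
        return 0
    if l1[0] < l2[0]:
        return (len(l1) - 1) + (len(l1) + len(l2)) + count_copies(l1[1:], l2)
    else:
        return (len(l2) - 1) + (len(l1) + len(l2)) + count_copies(l1, l2[1:])

for n in (50, 100, 200, 400):
    front = sorted(random.random() for _ in range(n // 2))
    back = sorted(random.random() for _ in range(n // 2))
    print(n, count_copies(front, back))  # grows roughly 4x when n doubles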

Now consider a recursion tree: at level i of the tree there are 2^i subproblems of size n/2^i, so that level does about 2^i * (n/2^i)^2 = n^2 / 2^i work. Summing over the levels gives a geometric series n^2 * (1 + 1/2 + 1/4 + ...), so the top level dominates and the total is O(n^2).
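
You can also check the total numerically; a small sketch (mine, assuming W(1) = 1 and n a power of two) shows the ratio W(n)/n^2 settling near a constant rather than growing like log(n):

def W(n):
    # The recurrence above, with W(1) taken to be 1 and n a power of two.
    return 1 if n == 1 else 2 * W(n // 2) + n * n

for n in (16, 64, 256, 1024):
    print(n, W(n), W(n) / (n * n))  # the ratio settles near 2, not log(n)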

However, there is a way to write your unite function so it runs much faster, in O(n) time. In that case, you'd be doing (by a similar analysis as above) O(nlog(n)) work.
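
For reference, a minimal sketch of that linear-time merge (the standard two-index technique; this is not the original poster's code):

def unite_linear(l1, l2):
    # Merge two already-sorted lists in O(n) by advancing an index into each
    # list and appending to one result list, instead of slicing and
    # concatenating on every step.
    merged = []
    i = j = 0
    while i < len(l1) and j < len(l2):
        if l1[i] < l2[j]:
            merged.append(l1[i])
            i += 1
        else:
            merged.append(l2[j])
            j += 1
    merged.extend(l1[i:])  # at most one of these two extends is non-trivial
    merged.extend(l2[j:])
    return merged

Swapping this in for the recursive `unite` leaves the rest of the sort unchanged and brings each merge down to linear time, which is where the O(nlog(n)) total comes from.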

AAA

Let T(n) be the time for `sort` to execute on an n-element list.

T(n) = 2 T(n/2) + n

     = 2 [2 T(n/4) + n/2] + n

     = 4 T(n/4) + 2n

     = 4 [2 T(n/8) + n/4] + 2n

     = 8 T(n/8) + 3n
     ........

     = 2^k T(n/2^k) + k*n

     = 2^(log2 n) T(1) + (log2 n) * n      because T(1) = O(1)

     = n + n*log2 n

     = O(n log n)

There is an easy way to remember the complexity of recursive sorting solutions: T(selection sort) = O(n^2) and T(merge sort) = O(n log n). Your code is clearly of the merge sort type.
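
If it helps to see the unrolling, here is a small sketch (mine, assuming T(1) = 1 and n a power of two) that evaluates the recurrence directly and compares it with the closed form n + n*log2 n:

import math

def T(n):
    # The recurrence above, with T(1) taken to be 1 and n a power of two.
    return 1 if n == 1 else 2 * T(n // 2) + n

for n in (2, 8, 64, 1024):
    print(n, T(n), n + n * round(math.log2(n)))  # last two values match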

Q Yang

Counting element comparisons, the time complexity of the above code is O(nlog(n)).

There are n elements in the list at the start, and we recursively halve the list into `front` and `back`, which gives O(log(n)) levels of division. At each level, the `unite` calls look at each element of `l1` and `l2` only once, so they perform O(n) comparisons in total per level.

Hence, O(log(n)) levels of division with O(n) comparisons per level makes this algorithm O(nlog(n)) in time complexity.

The O(n^2) figure in the other answers comes from counting the extra lists that `unite` builds through slicing and concatenation at every call; if you count only the comparisons, as above, the figure is O(nlog(n)).
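
For a concrete view of the difference, here is a rough instrumented sketch (my own, not code from the question) that reproduces the structure of the posted `sort`/`unite` but tallies only the `l1[0] < l2[0]` comparisons; counted this way the growth tracks n*log2(n), while the slice-and-concatenate copying is what drives the O(n^2) figure in the other answers:

import math
import random

def count_comparisons(lst):
    # Same structure as the posted sort/unite, but tallies only the element
    # comparisons and ignores the cost of the slice/concatenation copies.
    count = 0

    def unite(l1, l2):
        nonlocal count
        if len(l1) == 0:
            return l2
        if len(l2) == 0:
            return l1
        count += 1
        if l1[0] < l2[0]:
            return [l1[0]] + unite(l1[1:], l2)
        return [l2[0]] + unite(l1, l2[1:])

    def rec(lst):
        if len(lst) <= 1:
            return lst
        mid = len(lst) // 2
        return unite(rec(lst[:mid]), rec(lst[mid:]))

    rec(lst)
    return count

for n in (128, 256, 512):
    data = [random.random() for _ in range(n)]
    print(n, count_comparisons(data), round(n * math.log2(n)))  # similar growth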

Rajesh Surana