
This is the method, written in Python:

def count_con(s, lo, hi):
    # Count the consonants in s[lo..hi] by divide and conquer.
    vls = ['a', 'e', 'i', 'o', 'u']
    if lo == hi:
        # Base case: a single character.
        if s[lo] not in vls:
            return 1
        else:
            return 0

    mid = (lo + hi) // 2

    # Recurse on each half; combining the results is a single addition.
    count_l = count_con(s, lo, mid)
    count_r = count_con(s, mid + 1, hi)

    return count_l + count_r

Assume we want to find its time complexity. I've come up with a recurrence relation:

T(n) = 2T(n/2) + f(n) if n > 1, and T(n) = O(1) if n <= 1

However, I cannot determine what f(n) would be. Will combining the two halves take linear O(n) time, just like in merge sort, or constant O(1) time? I tend to think that since it's only a single addition, constant O(1) is more suitable.

And overall, is this method O(n log n) or O(log n)?

  • It can't be better than O(n). You are looking at every single char, after all. – user2390182 Feb 16 '21 at 10:19
  • Why are you doing it with a recursive function? – 永劫回帰 Feb 16 '21 at 10:21
  • And indeed, by the master theorem, T(n) = 2T(n/2) + O(1) is O(n). See https://en.wikipedia.org/wiki/Master_theorem_(analysis_of_algorithms). See also https://stackoverflow.com/questions/25858500/solving-recurrence-tn-2tn-2-%CE%981-by-substitution. BTW, it is a terrible idea to solve this problem with recursion. – Lesiak Feb 16 '21 at 10:25
  • I tried recursion only for educational purposes. Thanks for the tip. – Paris Feb 16 '21 at 10:40

2 Answers


Will combining the two halves take linear O(n) time, just like in merge sort, or constant O(1) time? I tend to think that since it's only a single addition, constant O(1) is more suitable.

You are correct.

So if we assume one call does 1 unit of work (ignoring the recursive calls), all we have to do is establish the number of calls. If we assume that n = 2^k, we end up with 1 + 2 + 4 + ... + 2^k = 2^(k+1) - 1 = 2n - 1 calls, which is just O(n).
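
If you want to verify this empirically, here is a minimal sketch (my addition, not part of the original answer) that reuses the question's logic and adds a hypothetical global call counter:

calls = 0

def count_con_counted(s, lo, hi):
    # Same logic as count_con in the question, plus one counter increment
    # per call so the total number of calls can be observed.
    global calls
    calls += 1
    vls = ['a', 'e', 'i', 'o', 'u']
    if lo == hi:
        return 0 if s[lo] in vls else 1
    mid = (lo + hi) // 2
    return count_con_counted(s, lo, mid) + count_con_counted(s, mid + 1, hi)

count_con_counted("x" * 16, 0, 15)  # n = 16 = 2^4
print(calls)                        # 31, i.e. 2n - 1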

orlp

As mentioned in your question, combining the two halves takes only constant time. All the code does is recursively visit each index, check whether the character there is a consonant (i.e. not a vowel), and add up the counts. Overall it is only O(n).
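
For comparison, a simple single-pass version (my own sketch, not from the question or this answer) makes the linear amount of work explicit:

def count_con_simple(s):
    # One membership test per character: O(n) work, no recursion needed.
    vowels = {'a', 'e', 'i', 'o', 'u'}
    return sum(1 for ch in s if ch not in vowels)

print(count_con_simple("hello"))  # 3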

If you need some insight into when logarithmic complexities come into the picture, you can see this question to understand time complexities in general: here

The f(n) mentioned in your question takes only constant time here, but in merge sort it is O(n), because after the recursive calls return, and before the sorted array is passed back up, you access and linearly arrange the values of the entire subarray between the left and right indices.
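
For contrast, here is a rough sketch of merge sort's combine step (a generic merge, not tied to any particular implementation), which shows why that f(n) is linear: every element of both halves is touched once.

def merge(left, right):
    # Touch every element of both halves exactly once: O(n) combine work,
    # as opposed to the single addition used to combine the two counts above.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge([1, 3, 5], [2, 4, 6]))  # [1, 2, 3, 4, 5, 6]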

Vijay