
I was trying to write an algorithm which, given a complete binary tree, tells me whether it's a binary search tree (BST) or not. I am required to provide an algorithm with O(n) time complexity, where n is the number of nodes in the tree.

I came up with this solution (pseudo code):

def isBST(x):
    if x=NIL: return True
    if min(x.right) < x.value or max(x.left) > x.value: return False
    isBST(x.left)
    isBST(x.right)
    return True

where min(x.right) is the minimum value in the right subtree of x:

def min(x):
    while x.left:
        x = x.left
    return x.value

And max(x.left) gets the maximum value in the left subtree.
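
Something like the mirror image of min:

def max(x):
    while x.right:
        x = x.right
    return x.value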

I read online that this kind of solution has a running time of O(n) in the case of a balanced/full tree, because the recurrence will be:

      T(n) = T(n/2) + O(log n)

Is it true that this represents O(n) time complexity? If yes, why? If not, how can I make it O(n)?

DR_2001
  • How is `min` implemented? – trincot Dec 15 '22 at 18:00
  • Max is implemented by going to the rightmost descendant and min to the leftmost descendant. Both cost O(H), where H is the height of the subtree rooted at the given node – DR_2001 Dec 16 '22 at 01:07
  • If you are not sure about the complexity of your algorithm, start with some tests for different n (e.g. n = 32, 64, 128, ...) and count the operations in your code to get a first idea. – MrSmith42 Dec 16 '22 at 08:22

2 Answers


First, I should mention there is a bug in your function: it does not do anything with the booleans that are returned by the recursive calls, so that even if one of those returns false, the execution continues and returns true.

The correct code would process the recursive calls like this:

    return isBST(x.left) and isBST(x.right)
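
For concreteness, a corrected sketch (keeping your min/max helpers and comparisons; the NIL guards are an addition, since the original pseudo code would call min/max on a missing child):

    def isBST(x):
        if x == NIL: return True
        # only inspect children that exist (added guard)
        if x.right != NIL and min(x.right) < x.value: return False
        if x.left != NIL and max(x.left) > x.value: return False
        # use the results of the recursive calls (this was the bug)
        return isBST(x.left) and isBST(x.right)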

The time complexity is O(n) in the worst case, i.e. when the given tree is indeed a BST.

The best case is O(log n), when the given tree is not a BST and the first execution of the if-condition is true, so the execution ends right away. One call of min() is made, which is responsible for that O(log n).

When the given tree is a BST, all nodes take a turn to become the argument of the function, and it is also called O(n) times with NIL as argument, twice for every leaf node (left and right).

The more difficult part of the analysis concerns the calls of min() and max(). These visit h nodes, where h is the height of the given node. At first glance this looks like it would make the time complexity worse than O(n), but for leaves the value of h is 0, and that is true for half of the nodes, while the other extreme (the root) will have h equal to the height of the tree, but there is just one root.

This is similar to how a heap can be built with O(n) time complexity, using Floyd's heap construction algorithm which -- just like here -- involves an O(h) step for each node. This turns out to still make the overall operation O(n). I refer to How can building a heap be O(n) time complexity? for lots of answers that explain why this still is O(n). The reasoning is the same for this algorithm.
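
As a rough sketch of that counting argument for a complete tree with n nodes: about n/2^(h+1) nodes have height h, and each of them contributes O(h) work for its min/max walk, so

    total work = sum over h >= 0 of (n / 2^(h+1)) * O(h)
               = O( n * sum over h >= 0 of h / 2^h )
               = O(n)        (that series converges to 2)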

trincot

Your recurrence for the time complexity should be T(n) = 2T(n/2) + O(log n), because in your function you have some constant-time checks (the if conditions), plus finding the min and max of a subtree (O(log n) in the worst case), plus 2 recursive calls on the left and right child of the root. Now, what is the complexity of T(n)? You are right: T(n) is in O(n). You can prove it with the master theorem.
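
As a sketch of that master-theorem check: here a = 2, b = 2 and f(n) = O(log n), so

    n^(log_b a) = n^(log_2 2) = n
    f(n) = O(log n) = O(n^(1 - eps)) for some eps > 0    (case 1)
    T(n) = Theta(n)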

Also, if you were to find the min and max by searching all members of the subtrees, the time complexity would change to O(n log n), because the recurrence would become T(n) = 2T(n/2) + O(n).
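
The same check then lands in case 2: f(n) = Theta(n) = Theta(n^(log_2 2)), so

    T(n) = Theta(n log n)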

OmG
  • I think you're leaving out the cost of finding the min and max value in each subtree, which the OP seems (?) to be doing by walking from the node down to its leftmost / rightmost descendants. – templatetypedef Dec 15 '22 at 22:47
  • @templatetypedef yes that’s how I am doing it, sorry I forgot to mention it. Is the complexity still O(n) in a full tree? – DR_2001 Dec 16 '22 at 01:09
  • @templatetypedef thanks, you are right. It's addressed already. – OmG Dec 16 '22 at 14:25
  • Correct me if I’m mistaken, but does the Master Theorem apply here? You have b = 2, a = 1 so log_b a = 0 and then the log term isn’t bounded by O(1). – templatetypedef Dec 16 '22 at 15:30
  • @templatetypedef `b = 2`, but `a = 2` too, as we have `2T(n/2)`. Right? In that case, `log_b a = 1`. – OmG Dec 16 '22 at 15:57