
I know there are many questions answered on this topic, but I am not following what they are saying.
Specifically, my question is: why does the worst case happen when the bottom level is exactly half full, and not when it is completely full?
[Figure: two heaps of height 3 — (A) with 11 nodes, bottom level exactly half full, and (B) with 15 nodes, bottom level completely full]

I have browsed through these questions:
Worst case in Max-Heapify - How do you get 2n/3?
worst case in MAX-HEAPIFY: “the worst case occurs when the bottom level of the tree is exactly half full”

But my question is: why do we use the term 'half full' when the worst case could occur when the number of nodes in the tree is at its maximum?
To support my point I have attached an image. The height of A is 3 and the height of B is also 3, but the number of calls to heapify increases: inside the main loop of BUILD-MAX-HEAP, A calls heapify on ⌊11/2⌋ = 5 nodes, while B calls it on ⌊15/2⌋ = 7 nodes. Shouldn't B be the worst case?
I am sure my intuition is wrong somewhere, but I don't know where.

Yash Patel

1 Answer


The "worst case" here means the case where the larger subtree is largest relative to n.

In your figure (A), the tree has eleven nodes (n = 11), of which seven belong to the larger subtree, so the larger subtree has 7n/11 ≈ 0.636n nodes.

In your figure (B), the tree has fifteen nodes (n = 15), of which seven belong to each subtree, so each subtree has 7n/15 ≈ 0.467n nodes.

So although the larger subtree in figure (A) and that in figure (B) have the same absolute number of nodes (namely 7), the former is larger than the latter relative to n, because the latter has a larger n.
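
Here is a minimal Python sketch (my own illustration, not part of the original answer) that generalizes this comparison. For a subtree height `h`, it computes the fraction of the tree's n nodes held by the larger subtree, once when the bottom level is exactly half full and once when it is completely full:

```python
def half_full_fraction(h):
    """Bottom level exactly half full: the left subtree is complete with
    height h, the right subtree is complete with height h - 1."""
    left = 2 ** (h + 1) - 1   # nodes in the left subtree
    right = 2 ** h - 1        # nodes in the right subtree
    n = left + right + 1      # plus the root
    return left / n

def full_fraction(h):
    """Bottom level completely full: both subtrees are complete with height h."""
    sub = 2 ** (h + 1) - 1    # nodes in each subtree
    n = 2 * sub + 1           # plus the root
    return sub / n

for h in range(2, 9):
    print(h, round(half_full_fraction(h), 4), round(full_fraction(h), 4))
```

With h = 2 this reproduces the figures: 7/11 ≈ 0.6364 for (A) and 7/15 ≈ 0.4667 for (B). As h grows, the half-full fraction climbs toward 2/3, while the full-bottom fraction stays below 1/2 — which is where the 2n/3 bound in the linked questions comes from.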

ruakh
  • What does this `largest relative to n` information give us? Is it safe to say that there is no particular worst case, since the positions of elements in the tree are strictly regulated to keep it as complete a tree as possible? Is time affected by the quantity of `n` rather than the arrangement of `n`? – Yash Patel Jul 24 '20 at 04:48
  • @YashPatel: When analyzing an algorithm or data structure, we analyze its worst case because that gives us a *guarantee*: no other case is worse. In the case of this heap structure, this "worst case" analysis tells us that the size of a subtree is *at most* two-thirds the total size of the tree. – ruakh Jul 24 '20 at 05:15
  • OK, let us assume that the left subtree has two-thirds the size of the whole tree, so this is the worst case. Now if we call heapify on the root node and the node sinks through the right subtree, it performs one fewer recursive step than it would through the left subtree; the case where the node passes through the left subtree is worse. Generally we construct the worst case from the input's arrangement, distribution, etc., not from the value of n. But here we are focusing on n by making the last level half full. Why? – Yash Patel Jul 24 '20 at 14:29
  • @YashPatel: A heap is a data structure with the APIs "insert element", "check if empty", "retrieve greatest element", and "remove greatest element". The implementation of that data structure -- nodes, arrays, etc. -- doesn't figure into those API definitions. There are many different kinds of data structures that can support those APIs in different ways, and with different performance characteristics. So the number of elements in the data structure at any given time is a fundamental characteristic of the use-case, and the only way we can compare the different data structures *[continued]* – ruakh Jul 24 '20 at 16:48
  • *[continued]* that offer those APIs is to quantify its worst-case performance for a given number of elements. – ruakh Jul 24 '20 at 16:49
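
For completeness, here is the standard step (CLRS-style; it is implied by the linked questions but not spelled out in this thread) that the two-thirds guarantee feeds into. Since each recursive call of MAX-HEAPIFY descends into a subtree with at most 2n/3 nodes, its running time satisfies

$$T(n) \le T(2n/3) + \Theta(1),$$

which solves to T(n) = O(log n) by case 2 of the master theorem. The half-full bottom level is exactly the shape that forces the constant 2/3 here; a completely full bottom level would only ever hand the recursion a subtree of about n/2 nodes.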