I came upon the recurrence relation for the max-heapify algorithm while going through CLRS. My teacher had justified, quite trivially in fact, that the time complexity of the max-heapify process is O(log n), simply because the worst case is when the root has to 'bubble/float down' from the top all the way to the last level. This means we travel layer by layer, and hence the number of steps equals the number of levels, i.e. the height of the heap, which, as we know, is bounded by log n. Fair enough.
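For reference, this is the procedure I have in mind, as a minimal Python sketch (0-indexed array and my own naming; CLRS writes it as 1-indexed pseudocode):

```python
def max_heapify(a, i, heap_size):
    """Float a[i] down until the subtree rooted at i satisfies the max-heap property.

    Assumes (as in CLRS) that the subtrees rooted at i's children are already max-heaps.
    """
    left = 2 * i + 1
    right = 2 * i + 2
    largest = i
    if left < heap_size and a[left] > a[largest]:
        largest = left
    if right < heap_size and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]  # swap down one level, then recurse
        max_heapify(a, largest, heap_size)
```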
The same result, however, is proven in CLRS in a more rigorous manner via a recurrence relation. The worst case was said to occur when the last level is half full, and this has already been explained here. So, as far as I understand from that answer, they arrived at this conclusion mathematically: we want to maximize the size of the left subtree L relative to the heap size n, i.e. to maximize the value of L/n. To achieve this, the last level must be exactly half full, so that the number of nodes in L (the left subtree) is as large as possible relative to n, and L/n is maximized.
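If it helps, the recurrence I am referring to is T(n) <= T(2n/3) + Θ(1), where 2n/3 is the bound CLRS gives on the size of a child's subtree in an n-node heap, attained (in the limit) when the bottom level is exactly half full; this solves to O(log n).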
Adding any more nodes to the last level will increase n but bring no change to the value of L, since the new nodes fall in the right subtree. So L/n will decrease as the heap becomes more balanced. All is fine and good as long as it stays mathematical.
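To spell out the arithmetic as I understand it: with the bottom level exactly half full, the left subtree is a perfect tree with 2^(h+1) - 1 nodes and the right subtree a perfect tree with 2^h - 1 nodes, so n = 3·2^h - 1 and L/n = (2·2^h - 1)/(3·2^h - 1), which approaches 2/3 from below (3/5 for n = 5, 7/11 for n = 11, 15/23 for n = 23, ...). That is where the 2n/3 in the recurrence comes from.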
Now this is where I get stuck. Let's say I add one more node to this half-full level. Practically, I fail to see how this reduces the number of steps/comparisons that occur, so that it is no longer the worst case. Even though I have added one more node, all the comparisons still happen in the left subtree and have nothing to do with the right subtree. Can someone convince me / help me realise why and how exactly it works out that L/n has to be maximized for the worst case? I would appreciate an example input and an explanation of why adding more nodes no longer gives the worst possible case.
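To make my confusion concrete, here is the kind of quick check I have in mind (my own 0-indexed, iterative sift-down with a comparison counter; the two input arrays are just an example I made up):

```python
def sift_down(a, i, heap_size):
    """Iterative max-heapify from index i; returns the number of comparisons made."""
    comparisons = 0
    while True:
        left, right, largest = 2 * i + 1, 2 * i + 2, i
        if left < heap_size:
            comparisons += 1
            if a[left] > a[largest]:
                largest = left
        if right < heap_size:
            comparisons += 1
            if a[right] > a[largest]:
                largest = right
        if largest == i:
            return comparisons
        a[i], a[largest] = a[largest], a[i]  # float the old root down one level
        i = largest

for arr in ([1, 10, 9, 8, 7],        # bottom level half full: L = 3, n = 5
            [1, 10, 9, 8, 7, 6]):    # one extra node, in the right subtree: L = 3, n = 6
    print(len(arr), sift_down(arr, 0, len(arr)), arr)
```

Both runs report the same 4 comparisons and the root sinks along exactly the same path through the left subtree, which is precisely why I don't see how the six-node heap is any 'less' of a worst case than the five-node one.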