The claim on the Wikipedia page for binary heaps is that insertion is O(log n) in the worst case, but O(1) on average:
The number of operations required depends only on the number of levels the new element must rise to satisfy the heap property, thus the insertion operation has a worst-case time complexity of O(log n) but an average-case complexity of O(1).
The linked page attempts to justify this as follows:
However, on average, the newly inserted element does not travel very far up the tree. In particular, assuming a uniform distribution of keys, it has a one-half chance of being greater than its parent; it has a one-half chance of being greater than its grandparent given that it is greater than its parent; it has a one-half chance of being greater than its great-grandparent given that it is greater than its grandparent, and so on [...] so that in the average case insertion takes constant time
This is surely nonsense, though? It seems to me that if the tree were randomly ordered, a new element would indeed have a 50/50 chance of being greater than its parent; but since, roughly speaking, the large elements sink to the bottom, the odds should be much worse than 50/50 as the heap grows.
Is that right?
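One way to probe this empirically (a rough sketch, not a proof: a min-heap built by inserting uniform random keys one at a time, counting how many levels each new element actually climbs) is:

```python
import random

def sift_up_count(heap, i):
    """Bubble heap[i] up toward the root (min-heap); return levels climbed."""
    moves = 0
    while i > 0:
        parent = (i - 1) // 2
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
            moves += 1
        else:
            break
    return moves

def average_insert_moves(n, trials=10):
    """Average sift-up distance per insertion when building heaps of size n
    from uniform random keys."""
    total_moves = total_inserts = 0
    for _ in range(trials):
        heap = []
        for _ in range(n):
            heap.append(random.random())
            total_moves += sift_up_count(heap, len(heap) - 1)
            total_inserts += 1
    return total_moves / total_inserts

for n in (1_000, 10_000, 100_000):
    print(n, round(average_insert_moves(n), 3))
```

If the average number of moves stays flat as n grows by orders of magnitude, that supports the O(1)-on-average claim for this insertion model, whatever one thinks of the independence argument quoted above.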
It's been like that on Wikipedia for several months...