When an O(1) amortized step is performed n times, the correct conclusion is that the total cost is O(n), full stop, not merely "O(n) amortized." That a step is O(1) amortized means its average cost over n executions is at most some constant c, and an average of at most c implies the total cost of those n steps is at most cn. So the cost of n steps is O(n), not just O(n) amortized.
By the definition of amortized cost under the aggregate method, saying an operation is T(n)/n amortized means there is some upper bound T(n) on the total cost of performing n operations. So, if an operation is O(1) amortized, meaning there is some c such that the average cost is at most c, we have T(n)/n ≤ c, therefore T(n) ≤ cn, and therefore performing n operations costs at most cn. Hence the cost of n operations is O(n), not just O(n) amortized.
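As a concrete illustration, here is a minimal sketch (not any library's actual implementation) of the classic doubling dynamic array, with the work of every append and every rebuild counted explicitly. The geometric series of rebuild costs keeps the measured T(n)/n below a small constant:

```cpp
// Sketch: a doubling dynamic array whose per-append work we count
// explicitly, to check the aggregate bound T(n) <= c*n empirically.
#include <cstdio>

int main() {
    const long n = 1'000'000;      // number of appends
    long capacity = 1, size = 0;
    long total_work = 0;           // T(n): element writes + copies

    for (long i = 0; i < n; ++i) {
        if (size == capacity) {    // "rebuild": copy everything over
            total_work += size;    // cost of copying size elements
            capacity *= 2;
        }
        total_work += 1;           // cost of writing the new element
        ++size;
    }

    // The rebuild copies form a geometric series 1 + 2 + 4 + ... < 2n,
    // so T(n) < 3n: an O(n) total, hence O(1) amortized per append.
    std::printf("T(n) = %ld, T(n)/n = %.3f\n",
                total_work, (double)total_work / n);
    return 0;
}
```

Running this prints a T(n)/n around 3 regardless of n, which is the constant c made visible.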
There can be some confusion in considering operations in isolation rather than as part of a sequence of n operations. If some program executes billions of unordered_set insertions and we take a random sample of n of them, it is not guaranteed that those n have O(1) amortized time: by bad luck, the sample could contain a disproportionate share of the insertions that happened to rebuild the table. In such a random selection, the statistical average time would be O(1), but each individual sample could fluctuate around it. In contrast, when we look at all of the insertions used to put n elements into the table, their times are correlated: the nature of the algorithm guarantees that table rebuilds occur only with a certain frequency, and this guarantees the total amount of work over those n insertions is O(n).
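The sketch below makes that frequency visible (the exact rehash schedule of std::unordered_set is implementation-defined, so the counts will vary by standard library). It inserts n elements and reports each time the bucket count changes, i.e., each time the table is rebuilt:

```cpp
// Sketch: count how often std::unordered_set actually rebuilds its table
// over n insertions, by watching for changes in bucket_count().
#include <cstddef>
#include <cstdio>
#include <unordered_set>

int main() {
    std::unordered_set<int> s;
    std::size_t last_buckets = s.bucket_count();
    int rebuilds = 0;

    for (int i = 0; i < 1'000'000; ++i) {
        s.insert(i);
        if (s.bucket_count() != last_buckets) {  // a rehash occurred
            ++rebuilds;
            std::printf("rebuild #%d after insertion %d: %zu buckets\n",
                        rebuilds, i + 1, s.bucket_count());
            last_buckets = s.bucket_count();
        }
    }
    // Only about O(log n) rebuilds occur over n insertions, and each one's
    // cost is proportional to the geometrically growing table size, so the
    // total rebuild work sums to O(n).
    return 0;
}
```

On typical implementations this prints on the order of 20 rebuilds for a million insertions, at geometrically growing sizes; that rarity is exactly the correlation that bounds the total work by O(n).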