This looks like a test question, so instead of just giving the answer, let me explain what each of these algorithmic complexity concepts means.
Let's start with the claim that a function `f(n)` runs in *constant time*. I'm aware it's not even mentioned in the question, but it's really the basis for understanding all the other complexities. If a function runs in constant time, its runtime does not depend on its input. It could be as simple as `print("Hello World")` or `return n`, or as complex as finding the first 1,000,000,000 prime numbers (which obviously takes a while, but takes the same amount of time on every invocation). That last example is more of an abuse of the mathematical notation, though; usually constant-time functions are fast.
Now, what does it mean if a function `f(n)` runs in *amortized* constant time? It means that a single call has no guarantee on how fast it will finish; but if you call it n times, the total time spent will be O(n) (and thus each invocation took O(1) on average). Here is a lengthier explanation from another StackOverflow answer. I can't think of any extremely simple function that runs in amortized constant time (but not constant time), but here is one non-trivial example:
```python
called = 0
next_heavy = 1

def f():
    global called, next_heavy
    called += 1
    if called == next_heavy:
        # Occasionally do a lot of work: print `called` numbers.
        for i in range(called):
            print(i)
        next_heavy *= 2
```
On the 512th call, this function prints 512 numbers; however, all the calls before it printed only 511 numbers in total, so after n calls its total number of prints is at most 2*n - 1, which is O(n). (Why 511? Because the sum of powers of two from 1 to 2^k equals 2^(k+1) - 1, and here 1 + 2 + ... + 256 = 511.)
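A quick way to check the O(n) bound is to count the would-be prints instead of actually printing (a small test harness, assuming a counting version of the function above):

```python
called = 0
next_heavy = 1
prints = 0  # count prints instead of performing them

def f():
    global called, next_heavy, prints
    called += 1
    if called == next_heavy:
        prints += called  # the heavy call "prints" `called` numbers
        next_heavy *= 2

# After every one of the first 1024 calls, total work stays within 2*n - 1.
for n in range(1, 1025):
    f()
    assert prints <= 2 * n - 1
```

The bound is tight exactly at the heavy calls: after 512 calls the counter sits at 511 + 512 = 1023 = 2*512 - 1.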
Note that every constant-time function is also an amortized constant-time function, because n calls take O(n) time in total. So non-amortized complexity is a bit stricter than amortized complexity.
Now, your question mentions a function with amortized *logarithmic* time, which similarly means that after n calls to this function, the total runtime is O(n log n) (and the average runtime per call is O(log n)). Per the question, this function is then called in a loop from 1 to N, and we just said that, by definition, those N calls together run in O(N log N). This is linearithmic.
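For a concrete instance of this pattern, here is a sketch using Python's `heapq`, where each push is O(log n) worst case (and therefore certainly amortized O(log n)), so pushing N items costs O(N log N) total:

```python
import heapq

def build_heap(values):
    # Each heappush is O(log n); pushing N items is O(N log N) overall.
    heap = []
    for v in values:
        heapq.heappush(heap, v)
    return heap

h = build_heap([5, 3, 8, 1, 9, 2])
print(heapq.heappop(h))  # pops the smallest element: 1
```

This is the same shape as the loop in your question: a per-call logarithmic cost, summed over N iterations.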
As for the second part of your question: can you deduce the total running time of the loop from our previous observations?