We're supposed to "consider the following algorithm, which operates on an array A[1..n] of integers."
for i in [1..n] do
    A[i] ← 0

for i in [1..n] do
    j ← i
    while j ≤ n do
        A[j] ← A[j] + 1
        j ← j + i
The assignment asks us to demonstrate that this algorithm runs in O(n log n).
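For concreteness, here is a direct transcription of the pseudocode into Python (my own rendering, not part of the assignment; Python lists are 0-based, so A[j - 1] stands in for the pseudocode's A[j]):

    def run(n):
        A = [0] * n  # already zeroed, but the first loop is kept for fidelity

        # First loop: set every A[i] to 0.
        for i in range(n):
            A[i] = 0

        # Second loop: for each i, visit j = i, 2i, 3i, ... while j <= n.
        for i in range(1, n + 1):
            j = i
            while j <= n:
                A[j - 1] += 1  # pseudocode's A[j] <- A[j] + 1
                j = j + i
        return A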
The first loop quite clearly adds only n steps to the running time, a lower-order term that gets dropped from the final bound.
The nested loops that follow will run faster than a pure O(n^2) algorithm, since the while loop doesn't always execute n times. As far as I can tell, when i = 1 it runs n times, when i = 2 it runs n-1 times, and so on, all the way up to i = n, where it runs just once.
But using the same method Gauss used to sum the integers from 1 to 100, that means the while loop runs (n+1)/2 times on average. Multiplying by the n iterations of the outer for loop gives (n^2 + n)/2 total steps, which simplifies to O(n^2), not O(n log n).
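Spelling out the Gauss trick, under the assumption that those per-pass counts are right: write the counts once forward and once backward and add them column by column. Every column sums to n + 1, and there are n columns, so twice the total is n(n + 1), giving

    n + (n-1) + (n-2) + ... + 2 + 1 = n(n+1)/2 = (n^2 + n)/2.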
How does this result in an O(n log n) running time?
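As a sanity check (my own addition, not part of the assignment), here is a small script that counts exactly how many times the inner statement A[j] ← A[j] + 1 executes and prints it next to the two candidate growth rates:

    import math

    def count_inner_iterations(n):
        """Total executions of the inner 'A[j] <- A[j] + 1' statement."""
        count = 0
        for i in range(1, n + 1):
            j = i
            while j <= n:
                count += 1
                j += i
        return count

    for n in (10, 100, 1000, 10000):
        actual = count_inner_iterations(n)
        quadratic = n * (n + 1) // 2   # the (n^2 + n)/2 prediction above
        nlogn = n * math.log(n)        # the growth rate the assignment claims
        print(f"n={n:>5}  actual={actual:>8}  (n^2+n)/2={quadratic:>9}  n*ln n={nlogn:>8.0f}")

Whichever column the actual count tracks as n grows should settle the question.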