
I am having a dispute with a co-worker regarding a Big O question. For example, consider the following Python for-loop that prints every 100th element:

n = 10000
for i in range(0, n, 100):
    print(i)

I think the complexity is O(log n) rather than O(n), because it doesn't seem to grow quite linearly. O(n) isn't wrong, I think, but isn't O(log n) more precise?

Thanks!

resnet
  • It's still O(n) because it does scale linearly with `n` – That1Guy Oct 16 '20 at 15:16
  • To be nit-picky, O(n) is actually wrong. It's O(n log n), since `print(i)` is O(log i) (it's impossible to print `k` characters in less than `k` time), so the whole thing is bounded by O(Sum_i(log i)) = O(log n!), which is O(n log n). – Mo B. Oct 16 '20 at 21:54
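
A minimal sketch of the digit-counting argument in the last comment, assuming each printed character costs one unit of work (that cost model, and the comparison value, are illustrative assumptions, not part of the thread):

import math

n = 10000
# Total characters emitted by print(i) over the loop:
# the digits of i plus a trailing newline per call.
total_chars = sum(len(str(i)) + 1 for i in range(0, n, 100))

# The comment's bound: about n/100 iterations, each printing
# O(log i) <= O(log n) characters, i.e. O(n log n) characters overall.
bound = (n / 100) * (math.log10(n) + 1)
print(total_chars, bound)  # prints 488 500.0; total_chars stays below the bound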

1 Answer


First of all, if the running time really were logarithmic, then calling it O(n) would be misleading, not just imprecise. But it is O(n), since the work does scale linearly: stepping by 100 gives n/100 iterations of the loop, and a constant factor of 1/100 does not change the big-O class.

Aplet123
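
To check the linear growth empirically, here is a minimal sketch (the particular values of n are arbitrary) that counts the iterations the loop performs:

# Count the loop's iterations for several input sizes; the count is n/100,
# so it grows linearly in n (a constant factor never changes the big-O class).
for n in (10_000, 100_000, 1_000_000):
    iterations = sum(1 for _ in range(0, n, 100))
    print(n, iterations, iterations / n)  # the ratio stays fixed at 0.01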