This is the code I have been given but I cannot decide whether it is O(log(n)) or O(n).
int i=n;
while (i > 0) {
i/=2;
}
Let's assume n = 1000. How many iterations will it take until i reaches 0?
Each iteration divides i by 2 (integer division), so we get the following table:
Iteration | i
----------|--------
0 | 1000
1 | 500
2 | 250
... | ...
... | ...
10 | 0 <-- here the loop stops
Does this help you figure out the complexity? (It should. Hint: what is log₂(1000), roughly, and how many iterations would O(n) predict?)