*I misread the initial question and have edited my answer.*
First, I'd try to simplify the code a little. In the expression `(k < N) && (k > j)`, `k` is always greater than `j`: it starts at `j + 1` and only ever increases (ignoring overflow), so the second test is always true and the condition reduces to `k < N`. That gives us:
```c
int i = 1;
while (i < N)      // ~N/3 iterations -> O(N)
    i = i + 3;     // O(1)

if (B[6] < 300) {  // only consider what's inside in the worst case
    for (int j = 1; j < N / 2; j++) {  // O(N)
        int k = j + 1;                 // O(1)
        while (k < N) {                // O(N)
            printf("%d", B[k]);        // O(1)
            k++;                       // O(1)
        }
    }
}
```
The `while (i < N)` loop performs O(N) iterations (about N/3 of them) and does constant, O(1), work on each one. If we do not go into the if statement, that is all the work the program does: we multiply N iterations by O(1) per iteration and still get O(N). This gives us the best-case runtime of O(N).
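If you want to see that count concretely, here is a minimal sketch (my own instrumentation, not the original program) that tallies the outer loop's iterations, assuming the best case where `B[6] >= 300` so the if branch is skipped:

```c
#include <stdio.h>

int main(void) {
    int N = 30;  /* hypothetical input size; any N shows the same pattern */

    int iterations = 0;
    int i = 1;
    while (i < N) {   /* the simplified outer loop */
        i = i + 3;
        iterations++; /* count one unit of O(1) work */
    }

    /* prints 10 for N = 30: about N/3 iterations, i.e. O(N) */
    printf("N = %d, outer loop ran %d times\n", N, iterations);
    return 0;
}
```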
Now for the worst case, assume we also go inside the if statement. The for loop then executes O(N/2) iterations, which is still O(N). Inside the for loop is another while loop that performs O(N) work for each value of `j`. Since `for (int j = 1; j < N / 2; j++)` and `while (k < N)` are nested, we multiply their two O(N) counts and get O(N^2).
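As a sanity check on the O(N^2) claim, here is a small sketch (again my own code; `body_count` is a hypothetical helper, not from the original) that counts how many times the innermost statement would execute. Doubling N roughly quadruples the count, which is the signature of quadratic growth:

```c
#include <stdio.h>

/* Count executions of the innermost statement for a given N. */
long body_count(int N) {
    long count = 0;
    for (int j = 1; j < N / 2; j++) {
        int k = j + 1;
        while (k < N) {  /* simplified condition */
            count++;     /* stands in for the printf call */
            k++;
        }
    }
    return count;
}

int main(void) {
    /* doubling N should roughly quadruple the count */
    for (int N = 1000; N <= 8000; N *= 2)
        printf("N = %4d -> %ld executions\n", N, body_count(N));
    return 0;
}
```

On my reading of the loop bounds this comes out to roughly 3N^2/8 executions, but the exact constant doesn't matter for Big-O.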
We then add the initial O(N) from the first while loop and the O(N^2) from the if branch to get O(N^2 + N), which is really just O(N^2), since we only care about the highest-scaling term. This makes O(N^2) the worst-case runtime.
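To put concrete numbers on that last step: at N = 1,000 the N^2 term contributes 1,000,000 while the N term contributes only 1,000, so the linear term shifts the total by about 0.1% and O(N^2 + N) describes the same growth as O(N^2).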