I am trying to learn about the time complexity of algorithms. My professor has pushed us beyond Big O and wants us to be able to derive an exact mathematical function from an algorithm. I am having a hard time conceptualizing how this is done and was looking for help. My class notes provide a selection sort algorithm (shown in the code below) and ask the following question: "Derive a function f(n) that corresponds to the total number of times that minIndex or any position of nums is modified in the worst case." The notes state that the answer is f(n) = 1/2n^2 + 5/2n + 3, and I was wondering if anyone could explain how that is derived.
My professor told us to count the operations in the inner loop and work our way out. So I believe that in the worst case the if statement in the inner loop always executes, which would mean the inner loop performs n-i-1 modifications. I get this value by taking n (the bound the inner for loop has to stay below) and subtracting the starting value (i+1). Then I look at the outer loop and see it goes from i until it reaches n-1, so it could be written as (n-1)-i, or, just like the inner loop, n-i-1. Looking further, there are three modifications in the outer loop body, so we get (n-i-1)+3 (could I write that as n-i+2?).
The number of modifications in the worst case for the inner loop: n-i-1
The number of modifications in the worst case for the outer loop: (n-i-1)+3
Now I would like to know: how do you go from counting these two kinds of modifications to f(n) = 1/2n^2 + 5/2n + 3?
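My best guess is that the total should come from summing the per-iteration count over every value that i takes in the outer loop, something like the following (this is just my own attempt at setting it up, so it may well be wrong):

    f(n) = sum over i = 0, 1, ..., n-2 of [ (n-i-1) + 3 ]
         = [ (n-1) + (n-2) + ... + 1 ] + 3(n-1)

but I don't see how to simplify that, or whether it even leads to the answer in the notes.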
public static void selectionSort(int[] nums) {
    int n = nums.length;
    int minIndex;
    for (int i = 0; i < n - 1; i++) {
        // find the index of the min number in nums[i..n-1]
        minIndex = i;
        for (int j = i + 1; j < n; j++) {
            if (nums[j] < nums[minIndex]) {
                minIndex = j;
            }
        }
        // swap the min number into position i
        int temp = nums[i];
        nums[i] = nums[minIndex];
        nums[minIndex] = temp;
    }
}
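In case it helps, I also tried to sanity-check my counting empirically with an instrumented copy of the code (this is my own modification, not from the notes): it simply increments a counter every time minIndex or a position of nums is assigned.

public static long countModifications(int[] nums) {
    long count = 0;
    int n = nums.length;
    int minIndex;
    for (int i = 0; i < n - 1; i++) {
        minIndex = i;                     // modification of minIndex
        count++;
        for (int j = i + 1; j < n; j++) {
            if (nums[j] < nums[minIndex]) {
                minIndex = j;             // modification of minIndex (worst case: every iteration)
                count++;
            }
        }
        int temp = nums[i];               // only reads nums, so not counted
        nums[i] = nums[minIndex];         // modification of a position of nums
        count++;
        nums[minIndex] = temp;            // modification of a position of nums
        count++;
    }
    return count;
}

The minIndex = i assignment and the two nums assignments happen the same number of times for every input, so only the minIndex = j count depends on the input, and I'm not sure which input actually produces the worst case, so I couldn't use this to confirm the formula directly.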