I wrote a C++ program that measures the execution time (in microseconds) of several algorithms against input size and writes the results to a .csv file. After importing the .csv into LibreOffice Calc and plotting the data,
I noticed that binary search appears to run in constant time for input sizes up to 10000, even though I search for an element that is not in the array. Similarly, up to the same input size, merge sort appears to run in linear time rather than the linearithmic O(n log n) time it takes in all cases.
Insertion sort and bubble sort behave as expected: their plots closely match their worst-case quadratic complexity.
I provide the input arrays from a file. For n = 5, the file contains the following, one input array per line:
5 4 3 2 1
4 3 2 1
3 2 1
2 1
1
Running insertion sort produces the following results.csv:
Input,Time(ms)
5,4
4,3
3,2
2,2
1,2
The graph of binary search for a maximum input size of 100 is here.
The graph of merge sort for a maximum input size of 1000 is here; it looks very much linear (the values in the table suggest the same).
Any help understanding why this is happening would be greatly appreciated.
Here is a link to the GitHub repository with the source code: https://github.com/dhanraj-s/Time-Complexity