I have heard that there is no algorithm faster than linear search (for an unsorted array), but when I run this algorithm (linear):
public static void search(int[] arr, int value){
    for(int i = 0; i < arr.length; i++){
        if(arr[i] == value) return;
    }
}
With a random array of length 1000000, the average time to find a value is 75ns, but with this algorithm:
public static void skipSearch(int[] arr, int value){
    for(int i = 0; i < arr.length; i+=2){
        if(arr[i] == value) return;
    }
    for(int i = 1; i < arr.length; i+=2){
        if(arr[i] == value) return;
    }
}
I get a shorter average of 68ns?
Edit: A lot of you are saying that I didn't do a proper benchmark and that this was a fluke, but I ran these functions 1000000 times and took the average. Every time I ran the functions 1000000 times, I got 75-76ns for the first algorithm and 67-69ns for the second. I used Java's System.nanoTime() to measure this.
Code:
int[] arr = new int[1000];
Random r = new Random();
for(int i = 0; i < arr.length; i++){
    arr[i] = r.nextInt();
}

int N = 1000000;

long startTime = System.nanoTime();
for(int i = 0; i < N; i++){
    search(arr, arr[(int) Math.floor(Math.random()*arr.length)]);
}
System.out.println("Average Time: "+(System.nanoTime()-startTime)/(float)N+"ns");

startTime = System.nanoTime();
for(int i = 0; i < N; i++){
    skipSearch(arr, arr[(int) Math.floor(Math.random()*arr.length)]);
}
System.out.println("Average Skip Search Time: "+(System.nanoTime()-startTime)/(float)N+"ns");