I have written the following Java code to count the number of 1s in a bit vector using a divide-and-conquer method. I'm trying to figure out the time complexity of the algorithm "count", assuming we have a function IsZero(int[] arr, int low, int high) that (perhaps unrealistically) runs in O(1) and returns true if all the bits between low and high are 0. I include a main method showing how the algorithm can be used, as well as my own implementation of IsZero (which does not run in O(1)). So the question is: what is the time complexity of this algorithm, given that IsZero runs in O(1)?
class Algorithm {
    static int NumberOfOnes(int v[]) {
        return count(v, 0, v.length - 1);
    }

    // My linear-time stand-in for the O(1) IsZero assumed in the question.
    static boolean IsZero(int arr[], int low, int high) {
        for (int i = low; i <= high; i++) {
            if (arr[i] == 1) {
                return false;
            }
        }
        return true;
    }

    static int count(int arr[], int low, int high) {
        // Base case: a single bit that is 1.
        if (low == high && arr[low] == 1) {
            return 1;
        }
        // Base case: exactly two bits.
        if (high - low == 1) {
            return arr[low] + arr[high];
        }
        // Prune any subrange that contains no 1s (also covers a single 0 bit).
        if (IsZero(arr, low, high)) {
            return 0;
        }
        int mid = (low + high) / 2;
        return count(arr, low, mid) + count(arr, mid + 1, high);
    }

    public static void main(String args[]) {
        int arr[] = {1, 1, 0, 0, 0, 0, 1, 0, 1, 1};
        int k = NumberOfOnes(arr);
        System.out.println(k); // Should print 5 as result
    }
}
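To get a feel for where the complexity comes from, here is a small instrumented sketch I put together (the class name CountProbe and the static call counter are my own additions, not part of the algorithm) that tallies how many times count is invoked on the two extreme inputs: an all-zeros array, where IsZero prunes the very first call, and an all-ones array, where IsZero never prunes and the recursion runs all the way down to size-2 ranges.

```java
import java.util.Arrays;

// Instrumented copy of count() that tallies recursive calls, to show how the
// IsZero pruning shapes the recursion tree in the best and worst cases.
class CountProbe {
    static int calls = 0; // incremented on every entry to count()

    static boolean isZero(int[] arr, int low, int high) {
        for (int i = low; i <= high; i++) {
            if (arr[i] == 1) return false;
        }
        return true;
    }

    static int count(int[] arr, int low, int high) {
        calls++;
        if (low == high && arr[low] == 1) return 1;
        if (high - low == 1) return arr[low] + arr[high];
        if (isZero(arr, low, high)) return 0;
        int mid = (low + high) / 2;
        return count(arr, low, mid) + count(arr, mid + 1, high);
    }

    public static void main(String[] args) {
        int n = 1024;
        int[] zeros = new int[n]; // best case: pruned on the first call
        int[] ones = new int[n];  // worst case: IsZero never returns true
        Arrays.fill(ones, 1);

        calls = 0;
        count(zeros, 0, n - 1);
        System.out.println("all zeros: " + calls + " calls"); // 1 call

        calls = 0;
        count(ones, 0, n - 1);
        System.out.println("all ones:  " + calls + " calls"); // n - 1 = 1023 calls
    }
}
```

For n a power of two, the all-ones case gives the recurrence T(2) = 1, T(n) = 1 + 2T(n/2), i.e. n - 1 calls in total, while the all-zeros case is a single call; the interesting part of the question is how the O(1) IsZero assumption bounds everything in between.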