I am trying to bin raw data to make a histogram. The raw data are stored in an array named data[k] (please refer to the code below). I have specified bins of fixed width 0.01, and the upper boundary of each interval is stored in an array called z[i]. I bin the data by counting the number of data points that fall in each interval between z[i-1] and z[i]. For this problem I have made 30 intervals covering 0 to 0.3, each of width 0.01.
//creating bins in z
zmin = 0.01;
for (i = 0; i < 30; i++) {
    z[i] = zmin + i*zmin;  //z[0] = 0.01, z[1] = 0.02, ..., z[29] = 0.30
}

//binning the data
for (i = 1; i < 30; i++) {
    for (k = 0; k < 100000; k++) {
        if (data[k] > z[i-1] && data[k] <= z[i]) {
            bincount[i] += 1;
        } //if
    } //k loop
} //i loop

//first bin: all data points at or below z[0]
for (k = 0; k < 100000; k++) {
    if (data[k] <= z[0]) {
        bincount[0] += 1;
    } //if
} //k loop
The elements of z[i] are:
z[0] = 0.01, z[1] = 0.02, and so on...
The binning produces accurate results for most bins. However, the bincount for the interval between z[5] (=0.06) and z[6] (=0.07) comes out to be 0, even though the actual count in my original data is non-zero, and the bincount for the interval between z[6] (=0.07) and z[7] (=0.08) is erroneously the total count of the two intervals combined. However, when I write the literal 0.07 in the if-statement instead of z[6] (which I tried separately), it gives the correct result.
I have also verified that the array z[i] stores the values correctly, and it seems fine. Hence, I am confused as to why this problem arises only for the particular intervals with boundary z[6], while the other bins give correct results. Am I doing something wrong here?