The sizeof operator in C++ is behaving very strangely, and I am unable to make any sense of it. I was writing code to implement the binary search algorithm to find an element in an array. To keep the number of arguments in the function small, I decided to get the length of the array using sizeof. Here is the code:
#include <bits/stdc++.h>
using namespace std;

int binarySearch(int arr[], int target) {
    int low = 0;
    // intended to be the number of elements minus 1, i.e. 9 for the array in main
    int high = sizeof(arr) / sizeof(arr[0]) - 1;
    cout << "high is: " << high << endl;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (arr[mid] == target) return mid;
        else if (arr[mid] > target) high = mid - 1;
        else low = mid + 1;
    }
    return -1;
}

int main() {
    int array[] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    cout << binarySearch(array, 10) << endl;
    // same expression, but applied to the array directly in main
    cout << "size of behaviour in main: " << sizeof(array);
    return 0;
}
The output is:
high is: 0
-1
size of behaviour in main: 40
On debugging, I realised that inside the binarySearch function, sizeof(arr) gives me the size of the int type, i.e. 4, so sizeof(arr) / sizeof(arr[0]) - 1 = 4 / 4 - 1 = 0.
However, in main, sizeof(array) gives me the size of int times the length of the array, i.e. 4 * 10 = 40.
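For what it's worth, here is a small extra print (my own addition, not in the code above) that I could drop into binarySearch to see what that 4 actually corresponds to:

cout << "sizeof(arr) in function: " << sizeof(arr) << ", sizeof(int*): " << sizeof(int*) << endl;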
What explains this change in the behaviour of sizeof between its use in main and its use inside the function definition?
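For context, this is the workaround I have in mind (the binarySearchWithSize name and the extra n parameter are my own additions, not part of the original code): compute the element count in main, where the whole array is still visible, and pass it in explicitly.

#include <bits/stdc++.h>
using namespace std;

// Workaround sketch: the caller computes the element count where the
// full array type is still known, and passes it in as n.
int binarySearchWithSize(int arr[], int n, int target) {
    int low = 0;
    int high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (arr[mid] == target) return mid;
        else if (arr[mid] > target) high = mid - 1;
        else low = mid + 1;
    }
    return -1;
}

int main() {
    int array[] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    // here sizeof(array) is 40, so n works out to 10
    int n = sizeof(array) / sizeof(array[0]);
    cout << binarySearchWithSize(array, n, 10) << endl;  // prints 9
    return 0;
}

This works, but I would still like to understand why sizeof behaves differently in the two places rather than just avoiding it.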