This is what I use to get the number of elements in an array:
sizeof(counts) / sizeof(unsigned long)
Now I want to make a simple function out of it like:
int length(unsigned long* input)
{
return (sizeof(input) / sizeof(unsigned long));
}
unsigned long array[3];
sizeof(array) = 12
sizeof(unsigned long) = 4
sizeof(array) / sizeof(unsigned long) = 3
length(array) = 1
Inside length(), sizeof(input) returns 4
The compiler will complain when I change the function to:
int length(unsigned long input)
{
return (sizeof(input) / sizeof(unsigned long));
}
I'm either stoned or just dumb, what am I doing wrong?