Here is the essence of what I am trying to do:
double *E;  // array of doubles
int N;      // the eventual size of the array, typically > 1

// ... some code where N gets assigned ...

// inside of some function:
E = malloc(sizeof(double) * N);
printf("size of E = %zu\n", sizeof(E) / sizeof(E[0]));  // checking size of array E
The output of this code is "size of E = 1", regardless of the actual value of N. Why does malloc() not allocate the correct amount of memory?
I know this seems very rudimentary, but I cannot understand why this would not work.
Any insight would be greatly appreciated.