I apologize in advance if I'm misunderstanding data types or the way C operates (I'm still relatively new to the language). Can someone explain why I am unable to get the desired output (see below)?
#include <stdio.h>
#include <math.h>

#define size 10

typedef unsigned int Uint16;
typedef long double float64;

Uint16 i;
float64 normalization = 0;

void main() {
    for (i = 0; i < size; i++)
    {
        normalization = normalization + i; // same as "normalization += i;"
        printf("\n --------");
        printf("\n normalization %f", normalization);
    }
    return;
}
The console output is the following:
--------
normalization 0.000000
--------
normalization -0.000000
--------
normalization -2.000000
--------
normalization -2.000000
--------
normalization -0.000000
--------
normalization -3105036184601417900000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000.000000
--------
normalization -0.000000
--------
normalization -26815615859885194000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000.000000
--------
normalization -0.000000
--------
normalization -0.000000
I am expecting the output to be:
normalization = 0
normalization = 1
normalization = 3
normalization = 6
normalization = 10
normalization = 15
// and so forth...