I am observing an interesting result when I typecast the output of a calculation. Here is the code snippet:
int bitSize = (int)log10(1.0*16)/log10(2.0); // bitSize = 3, but it should be 4
int temp = log10(1.0*16)/log10(2.0);         // temp = 4
Basically I want to compute log2(16), which should be 4. I think my understanding of typecasting must be wrong. Any suggestions?
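
In case it helps, here is a self-contained version of what I am compiling (I'm assuming gcc here, linking the math library with -lm); it prints the intermediate values with full precision so you can see exactly what I get:

#include <stdio.h>
#include <math.h>   /* remember to link with -lm */

int main(void)
{
    /* The two lines from the snippet above, unchanged */
    int bitSize = (int)log10(1.0*16)/log10(2.0);
    int temp = log10(1.0*16)/log10(2.0);

    /* Print the intermediate double values with full precision */
    printf("log10(16)          = %.17g\n", log10(1.0*16));
    printf("log10(2)           = %.17g\n", log10(2.0));
    printf("log10(16)/log10(2) = %.17g\n", log10(1.0*16)/log10(2.0));
    printf("bitSize = %d, temp = %d\n", bitSize, temp);
    return 0;
}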
Thanks