In my application I'm trying to display the bit representation of double variables. It works for smaller doubles, but not for values around 10^30.
Code:
#include <iostream>
#include <bitset>

using namespace std;

void Display(double doubleValue)
{
    // build a 64-bit bitset from the double value
    bitset<sizeof(double) * 8> b(doubleValue);
    cout << "Value : " << doubleValue << endl;
    cout << "BitSet : " << b.to_string() << endl;
}

int main()
{
    Display(1000000000.0);
    Display(2000000000.0);
    Display(3000000000.0);
    Display(1000000000000000000000000000000.0);
    Display(2000000000000000000000000000000.0);
    Display(3000000000000000000000000000000.0);
    return 0;
}
Output:
/home/sujith% ./a.out
Value : 1e+09
BitSet : 0000000000000000000000000000000000111011100110101100101000000000
Value : 2e+09
BitSet : 0000000000000000000000000000000001110111001101011001010000000000
Value : 3e+09
BitSet : 0000000000000000000000000000000010110010110100000101111000000000
Value : 1e+30
BitSet : 0000000000000000000000000000000000000000000000000000000000000000
Value : 2e+30
BitSet : 0000000000000000000000000000000000000000000000000000000000000000
Value : 3e+30
BitSet : 0000000000000000000000000000000000000000000000000000000000000000
My question is why bitset prints 64 zeros for the last three values. Interestingly, cout prints the actual values as expected.
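For context, what I expected to see is the raw IEEE 754 bit pattern of each double. Below is a sketch of how I assume those raw bits could be read directly; the memcpy-based DisplayRawBits helper is my own illustration, not part of the application:

#include <iostream>
#include <bitset>
#include <cstring>
#include <cstdint>

using namespace std;

// Copy the double's object representation into a 64-bit integer,
// then construct the bitset from that integer.
void DisplayRawBits(double doubleValue)
{
    static_assert(sizeof(double) == sizeof(uint64_t), "expects 64-bit double");
    uint64_t raw = 0;
    memcpy(&raw, &doubleValue, sizeof raw);
    bitset<64> b(raw);
    cout << "Value : " << doubleValue << endl;
    cout << "Bits  : " << b.to_string() << endl;
}

int main()
{
    DisplayRawBits(1000000000.0);
    DisplayRawBits(1000000000000000000000000000000.0);
    return 0;
}

(If C++20 is available, std::bit_cast<uint64_t>(doubleValue) from <bit> would express the same reinterpretation.)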