0

I'm trying to make a program that converts a number into its binary representation.

Code:

    #include <iostream>
    #include <bitset>
    #include <climits>   // for CHAR_BIT
    using namespace std;
    int main()
    {
        int a;
        bitset<CHAR_BIT> n;
        cin >> a;
        n = bitset<CHAR_BIT>(a);
        cout << n << " ";
        return 0;
    }

The program gives the wrong answer for 585, since 585 needs more than 8 binary digits. How can I handle such larger numbers?

Vaibhav
  • 6,620
  • 11
  • 47
  • 72

2 Answers

4

585 mod 256 = 73 (assuming CHAR_BIT is 8)
73 in base 2 = 0b01001001
The program does print 01001001, so I don't see anything wrong.

If you want to store the whole range of a, the bitset should be declared as

    bitset<CHAR_BIT * sizeof(a)> n(a);
kennytm
  • 510,854
  • 105
  • 1,084
  • 1,005
0

A bitset has a fixed number of bits. You specify bitset<CHAR_BIT> -- on most systems, CHAR_BIT is 8 so you will have an 8-bit bitset. When you try to stuff a bigger number into the bitset, the most significant bits are discarded.

If you know in advance the largest numbers you will have to deal with, you can specify e.g. bitset<16> or bitset<32>. If you don't, you may have to use some other datatype.

Philip Potter
  • 8,975
  • 2
  • 37
  • 47