
The code below works the first two times through the loop, but on the third time the conversion to unsigned long fails and gives me 0xCF instead of 0xF3. Any idea what the problem is? It seems like a bug in the VS 2010 to_ulong. Binary '11110011' should convert to hex F3! Here are the results when running under the VS 2010 debugger.

1st time: bc_bit_char b'11000011' converts to k = x'000000c3'
2nd time: bc_bit_char b'00111100' converts to k = x'0000003c'
3rd time: bc_bit_char b'11110011' converts to k = x'000000cf' WRONG! Should be x'000000f3'

    std::bitset<8> bc_bit_char (00000000);
    unsigned char bc_char=' ', bc_convert_char=' ';
    unsigned long k=0, bc_rows=0;

    k = bc_bit_char.to_ulong(); // convert 8 bits to long integer with same bits
    bc_convert_char = static_cast<unsigned char> (k); // convert long integer to unsigned char
Duffy
  • How are you declaring the actual `std::bitset`s? E.g. `std::bitset<8> bc_bit_char (00000000)` misleads one into thinking you can write binary 0s and 1s there verbatim. – Yirkha Apr 17 '14 at 00:33
  • Sounds like you have your MSB/LSB mixed up. The first two patterns are palindromes. – Mark Ransom Apr 17 '14 at 00:33
  • Show us the program that actually does what you are talking about. What you've posted does nothing. – ooga Apr 17 '14 at 00:34
  • This is a large program. I am only showing the two lines of code that are in a loop. That is what I mean by 1st time through, 2nd time through (within the same run of the program). Disclaimer: if you see a palindrome, it is sheer coincidence. – Duffy Apr 17 '14 at 01:07
  • The bitset spec says the to_ulong function gives an unsigned long integer value with the same bit representation as the bitset object. So how does x'cf' represent the bitset submitted? Isn't x'cf' binary b'11001111'? That is not the same bitset submitted to the function. @Mark - maybe you are on to something. What I am ultimately trying to do is convert the 8-bit bitset to an unsigned char so I can add it to a C++ string. But if this part doesn't work, neither does the rest. Got the code from here: http://stackoverflow.com/questions/11068204/is-it-possible-to-convert-bitset8-to-char-in-c – Duffy Apr 17 '14 at 01:10
  • @MarkRansom - I think you are on to something! How is x'cf' represented in binary on a C++ x86 machine? Is it little endian? And then the LSB is on the left part of the nibble? So if the binary place values for a byte are 1248 1248, then x'cf' = b'1111 0011'? Maybe the "bug" (which I do have) is not where I think it is. I will look into the binary representation thing. I'm used to IBM mainframe, which is big endian.... – Duffy Apr 17 '14 at 02:07
  • The labeling of bits is independent of endianness. In x86 bit 0 is the least significant. In your `b'11110011'` which bit do you put into bit 0, the first or the last? – Mark Ransom Apr 17 '14 at 02:22
  • @MarkRansom To answer your question: I am building the bitset bc_bit_char left to right using subscripting, and bc_bit_char[0] is the left-most, first bit, and the LSB. FYI, an IBM mainframe values the bits in a byte like this: 8421 8421 (if the 1st bit is on, its value is 8, the 2nd bit is 4, etc.) and is big endian. IBM mainframe hex F3 = b'1111 0011'. The x86 values the bits in a byte like 1248 1248 and is little endian, so hex CF is then b'1111 0011'! So there is no bug in the to_ulong conversion. Your first answer is brilliant and correct. Thank you for fixing me :) – Duffy Apr 17 '14 at 03:08
  • This question is answered. Don't know how to mark it as answered. Anyone? Thanks all! Duffy – Duffy Apr 17 '14 at 03:10

1 Answer


There is confusion over which bit is most significant and which is least significant. In the binary constants you provide, the left-most bit is bit 0, which is least significant. In the hex value returned by to_ulong, the left-most bit is bit 7, which is most significant. Reverse one or the other and they will be equal.

Mark Ransom