I am working on a C# desktop application. I have a bit string and I want to toggle it.
c3 = DecimalToBinary(Convert.ToInt32(tbVal3.Text)).PadLeft(16, '0');
// above c3 is 0000001011110100
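(I believe the built-in Convert.ToString(value, 2) would do the same job as my DecimalToBinary helper, so the line above could also be written as:)

string c3 = Convert.ToString(Convert.ToInt32(tbVal3.Text), 2).PadLeft(16, '0');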
Splitting the above string into two parts (using Substring):
string part1 = c3.Substring(0, 8); // 00000010
string part2 = c3.Substring(8, 8); // 11110100
The MSB of the first octet (part1) shall be set to 1, and the MSB of the second (last) octet (part2) shall be set to 0; that MSB shall be shifted into the LSB of the first octet. This gives binary part1 = 10000101 and part2 = 01110100.
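Here is a minimal sketch of the operation as I understand it, using plain string manipulation (each octet simply carries 7 value bits, so this assumes the top two bits of the 16-bit string are 0, i.e. the value is below 16384):

// first octet: continuation bit 1 + the high 7 value bits
string newPart1 = "1" + c3.Substring(2, 7); // 10000101
// second octet: final bit 0 + the low 7 value bits
string newPart2 = "0" + c3.Substring(9, 7); // 01110100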
I have looked at the solution to Binary array after M range toggle operations, but I still don't see how to apply it here.
Rule
In the case of the application context name LN referencing with no ciphering,
the arc labels of the object identifier are (2, 16, 756, 5, 8, 1, 1);
• the first octet of the encoding is the combination of the first two
numbers into a single number, following the rule of
40*First+Second -> 40*2 + 16 = 96 = 0x60;
• the third number of the Object Identifier (756) requires two octets: its
hexadecimal value is 0x02F4, which is 00000010 11110100, but following the above rule,
the MSB of the first octet shall be set to 1 and the MSB of the second (last) octet shall
be set to 0, thus this bit shall be shifted into the LSB of the first octet. This gives
binary 10000101 01110100, which is 0x8574;
• each of the remaining numbers of the Object Identifier is encoded on one octet;
• this results in the encoding 60 85 74 05 08 01 01.
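To check that I have understood the rule, here is a sketch that reproduces the target encoding with integer arithmetic (a base-128 encoding where every octet except the last carries a continuation bit in its MSB, which is what I believe the rule describes):

using System;
using System.Collections.Generic;
using System.Linq;

class OidEncoder
{
    // Encode one arc label as base-128 groups of 7 bits; every octet
    // except the last has its MSB set to 1 (the continuation bit).
    static IEnumerable<byte> EncodeArc(int value)
    {
        var octets = new Stack<byte>();
        octets.Push((byte)(value & 0x7F));              // last octet, MSB = 0
        value >>= 7;
        while (value > 0)
        {
            octets.Push((byte)((value & 0x7F) | 0x80)); // MSB = 1
            value >>= 7;
        }
        return octets;
    }

    static void Main()
    {
        int[] arcs = { 2, 16, 756, 5, 8, 1, 1 };
        // first octet combines the first two numbers: 40*2 + 16 = 0x60
        var encoded = new List<byte> { (byte)(40 * arcs[0] + arcs[1]) };
        foreach (int arc in arcs.Skip(2))
            encoded.AddRange(EncodeArc(arc));
        Console.WriteLine(string.Join(" ", encoded.Select(b => b.ToString("X2"))));
        // prints: 60 85 74 05 08 01 01
    }
}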
How can I perform this toggle with binary strings?
Any help would be highly appreciated.