
I have the string "abcdefghij", and I want to get this string as bits. I tried this:

// requires: using System.Collections;
byte[] K = System.Text.Encoding.UTF8.GetBytes(args[1]); // UTF-8 bytes of the input string
var d = new BitArray(K);

In K I have [0x61, 0x62, ..., 0x6a], which is fine. But in d I have [1000 0110, 0100 0110, ..., 0101 0110] (not exactly as typed here; it is really an array of true and false values). In d each byte is exposed as bit[0]...bit[7], from the least to the most significant bit. That is not what I want.
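To illustrate the ordering, here is a minimal sketch (0x61 is 0110 0001 in binary):

using System;
using System.Collections;

var bits = new BitArray(new byte[] { 0x61 }); // 0x61 = 0110 0001
// BitArray maps index 0 to the least significant bit, so this
// prints True False False False False True True False (1000 0110).
foreach (bool bit in bits)
    Console.Write(bit + " ");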

I want to save the bits from the most significant to the least: [0110 0001, 0110 0010, ..., 0110 1010].

How can I deal with this?


1 Answer


I found the answer. In my situation I can just use this piece of code from this post:

byte[] bytes = ...
bool[] bits = bytes.SelectMany(GetBits).ToArray();

...

// Yields the bits of a byte from the most significant to the least significant.
static IEnumerable<bool> GetBits(byte b)
{
    for (int i = 0; i < 8; i++)
    {
        yield return (b & 0x80) != 0; // test the current most significant bit
        b *= 2;                       // shift the remaining bits left by one
    }
}

Now bits contains exactly what I want.
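A quick usage sketch (it assumes the GetBits method above, plus using System.Linq and using System.Text):

byte[] bytes = Encoding.UTF8.GetBytes("abcdefghij");
bool[] bits = bytes.SelectMany(GetBits).ToArray();

// The first eight entries are 0 1 1 0 0 0 0 1: 0x61 ('a'), MSB first.
Console.WriteLine(string.Join(" ", bits.Take(8).Select(b => b ? 1 : 0)));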

And this is the inverse transformation:

static byte[] GetBytes(bool[] bits)
{
    byte[] bytes = new byte[bits.Length / 8];
    for (int i = 0; i < bits.Length / 8; i++)
    {
        for (int j = 0; j < 8; j++)
        {
            // bits[(i * 8) + (7 - j)] walks the byte's bits from least to most
            // significant, so shifting by j rebuilds the byte from its MSB-first layout.
            bytes[i] |= (byte) (Convert.ToByte(bits[(i * 8) + (7 - j)]) << j);
        }
    }
    return bytes;
}
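A round-trip check (a sketch; it assumes the GetBits and GetBytes methods above and using System.Linq / using System.Text):

byte[] original = Encoding.UTF8.GetBytes("abcdefghij");
byte[] restored = GetBytes(original.SelectMany(GetBits).ToArray());

// Prints "abcdefghij", since GetBytes inverts GetBits exactly.
Console.WriteLine(Encoding.UTF8.GetString(restored));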