
I have a function that converts a string to hex, like this:

public static string ConvertToHex(string asciiString)
{
    string hex = "";
    foreach (char c in asciiString)
    {
         int tmp = c;
         hex += String.Format("{0:x2}", (uint)System.Convert.ToUInt32(tmp.ToString()));
    }
    return hex;
}

Could you please help me write a string-to-binary function based on my sample function?

public static string ConvertToBin(string asciiString)
{
    string bin = "";
    foreach (char c in asciiString)
    {
        int tmp = c;
        bin += String.Format("{0:x2}", (uint)System.Convert.????(tmp.ToString()));
    }
    return bin;
}
Nano HE
  • `char` => `int` => `string` => `uint` => `uint` (again?) … whoa! You’ve lost me there. – Konrad Rudolph Apr 14 '11 at 13:59
  • You seem to think that it is `ToUInt32` that is doing the conversion to hex, but it is actually the `x2` format specifier passed to String.Format. Unfortunately, I don't think there is a `b8` format specifier. – Justin Apr 14 '11 at 14:02
  • You can implement `ICustomFormatter`, as shown on MSDN [link](https://msdn.microsoft.com/en-us/library/system.icustomformatter.format(v=vs.110).aspx) – antonio Oct 12 '16 at 06:54
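
For reference, a minimal sketch of the ICustomFormatter route antonio mentions. The BinaryFormatProvider class name and the "B" format string are invented for this example; only the two interfaces are framework types.

using System;
using System.Globalization;

public class BinaryFormatProvider : IFormatProvider, ICustomFormatter
{
    public object GetFormat(Type formatType)
    {
        // string.Format asks the provider for an ICustomFormatter; hand back this instance.
        return formatType == typeof(ICustomFormatter) ? this : null;
    }

    public string Format(string format, object arg, IFormatProvider formatProvider)
    {
        // Custom "B" format: render a char as 8 binary digits.
        if (format == "B" && arg is char)
        {
            return Convert.ToString((char)arg, 2).PadLeft(8, '0');
        }

        // Fall back to the argument's normal formatting for everything else.
        IFormattable formattable = arg as IFormattable;
        if (formattable != null)
        {
            return formattable.ToString(format, CultureInfo.CurrentCulture);
        }
        return arg != null ? arg.ToString() : string.Empty;
    }
}

// Usage:
// string.Format(new BinaryFormatProvider(), "{0:B}", 'A')   // "01000001"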

4 Answers


Here you go:

// Requires using System; using System.Linq; and using System.Text; at the top of the file.
public static byte[] ConvertToByteArray(string str, Encoding encoding)
{
    return encoding.GetBytes(str);
}

public static String ToBinary(Byte[] data)
{
    return string.Join(" ", data.Select(byt => Convert.ToString(byt, 2).PadLeft(8, '0')));
}

// Use any sort of encoding you like. 
var binaryString = ToBinary(ConvertToByteArray("Welcome, World!", Encoding.ASCII));
Jaapjan
  • Please use `System.Text.UTF8Encoding`. – JSBձոգչ Apr 14 '11 at 14:03
  • His example says asciiString as parameter. Nor do I know what format the binary array should be. But you can change the encoding on demand. – Jaapjan Apr 14 '11 at 14:04
  • @JSBangs The OP does seem to want to use ASCII. But you’re right, that’s not what the original code does and it probably wouldn’t work either. But using UTF8 does something different yet. The equivalent to OP’s code would be to use `Unicode`. – Konrad Rudolph Apr 14 '11 at 14:05
  • The OP used a variable name "asciiString", but this does not change the fact that the string is UTF-16 LE (because that's what `string` always has). In my opinion, the *only* reason to ever use a non-Unicode encoding is in the thin interface layer to a legacy system that cannot be changed. And even then, that is only until the legacy system can be replaced. Now the OP may be saying that the characters in `asciiString` are restricted to the ASCII range (7-bit values). If that is the case, the UTF-8 solution will be identical to the ASCII solution, so UTF-8 should be used anyway. – Jeffrey L Whitledge Apr 14 '11 at 14:15
  • @JSBangs Right, I totally agree with that. My comment was more in the direction that UTF8 is probably also wrong, or at least not what the code is currently doing for any codepoint > 127. – Konrad Rudolph Apr 14 '11 at 17:57
  • **-1** The OP has written the question asking for the result in the form of a human-readable `string`, not a `byte[]`. – Slipp D. Thompson Oct 02 '14 at 22:46
  • Severity Code Description Project File Line Error Argument 2: cannot convert from 'System.Collections.Generic.IEnumerable' to 'string[]' blah blah.cs 822 – behelit Nov 19 '15 at 01:13
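
To make the encoding discussion above concrete, here is a small sketch using the two helpers from this answer (the sample character is arbitrary). Note also that the string.Join(string, IEnumerable<string>) overload behelit's error refers to exists only on .NET 4 and later; on 3.5, appending .ToArray() to the Select(...) call should compile.

// The same .NET string produces different bytes, and therefore different binary output,
// depending on the encoding chosen.
string s = "é";  // a single char, but not representable in 7-bit ASCII
Console.WriteLine(ToBinary(ConvertToByteArray(s, Encoding.ASCII)));    // 00111111 ('?': the character is lost)
Console.WriteLine(ToBinary(ConvertToByteArray(s, Encoding.UTF8)));     // 11000011 10101001
Console.WriteLine(ToBinary(ConvertToByteArray(s, Encoding.Unicode)));  // 11101001 00000000 (UTF-16 LE, closest to the OP's per-char code)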

It sounds like you basically want to convert an ASCII string, or preferably a byte[] (since you can encode your string to a byte[] using your preferred encoding), into a string of ones and zeros, i.e. 101010010010100100100101001010010100101001010010101000010111101101010

This will do that for you...

//Formats a byte[] into a binary string (010010010010100101010)
public string Format(byte[] data)
{
    //storage for the resulting string
    string result = string.Empty;
    //iterate through the byte[]
    foreach(byte value in data)
    {
        //storage for the individual byte
        string binarybyte = Convert.ToString(value, 2);
        //if the binarybyte is not 8 characters long, it's not a complete byte yet
        while(binarybyte.Length < 8)
        {
            //prepend the value with a 0
            binarybyte = "0" + binarybyte;
        }
        //append the binarybyte to the result
        result += binarybyte;
    }
    //return the result
    return result;
}
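
A possible usage of the Format method above; the Encoding.ASCII choice and the input string are just for illustration.

byte[] data = System.Text.Encoding.ASCII.GetBytes("Hi");
string bits = Format(data);
Console.WriteLine(bits);   // 0100100001101001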
Matthew Layton

Here's an extension function:

// Note: to be usable as an extension method, this must live in a non-generic static class.
public static string ToBinary(this string data, bool formatBits = false)
{
    char[] buffer = new char[(data.Length * 8) + (formatBits ? (data.Length - 1) : 0)];
    int index = 0;
    for (int i = 0; i < data.Length; i++)
    {
        string binary = Convert.ToString(data[i], 2).PadLeft(8, '0');
        for (int j = 0; j < 8; j++)
        {
            buffer[index] = binary[j];
            index++;
        }
        if (formatBits && i < (data.Length - 1))
        {
            buffer[index] = ' ';
            index++;
        }
    }
    return new string(buffer);
}

You can use it like:

Console.WriteLine("Testing".ToBinary());

and if you pass `true` as the parameter, it will separate each 8-bit group with a space.
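
For example, with the separator enabled (sample string picked for illustration):

Console.WriteLine("Hi".ToBinary(true));   // 01001000 01101001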

Krythic

The following will give you the hex encoding for the low byte of each character, which looks like what you're asking for:

StringBuilder sb = new StringBuilder();   // StringBuilder lives in System.Text
foreach (char c in asciiString)
{
    uint i = (uint)c;
    sb.AppendFormat("{0:X2}", (i & 0xff));
}
return sb.ToString();
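
If you want binary digits instead of hex with the same StringBuilder pattern, a possible variation (still masking to the low byte, as above) would be:

StringBuilder sb = new StringBuilder();
foreach (char c in asciiString)
{
    // Keep the low byte of each char and render it as 8 binary digits.
    sb.Append(Convert.ToString(c & 0xff, 2).PadLeft(8, '0'));
}
return sb.ToString();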
Jim Mischel