
What is the best way to convert a string of digits into their equivalent ASCII characters?
I think that I am over-complicating this.

Console.WriteLine($"Enter the word to decrypt: ");
//store the values to convert into a string
string vWord = Console.ReadLine(); 

for (int i = 0; i < vWord.Length; i++)
{
    int convertedIndex = vWord[i];
    char character = (char)convertedIndex;
    finalValue += character.ToString();

    Console.WriteLine($"Input: {vWord[i]} Index: {convertedIndex} Char {character}");
}
  • `vWord[i]` is `char` already, it's just implicitly castable to `int` – ProgrammingLlama Dec 15 '18 at 09:06
  • `convertedIndex` and `vWord[i]` are always equal, there is no need to use it twice – Dusan Radovanovic Dec 15 '18 at 09:08
  • Possible duplicate: https://stackoverflow.com/questions/5348844/how-to-convert-a-string-to-ascii –  Dec 15 '18 at 09:21
  • Thanks, I need to be more specific with my questions. If the user enters 97 98 99 I want it to display the chars "a b c". My problem is with the loop I am using. I tried string[] words = phrase.Split(' '); foreach (var word in words) but still no success. Need a break and come back with a fresh perspective. – Ricky Sinclair Dec 15 '18 at 09:29
  • 1
    Please edit your question with that information, Ricky :) – ProgrammingLlama Dec 15 '18 at 09:32

2 Answers


If the expected input values are something like this: 65 66 67 97 98 99, you could just split the input and cast the converted int values to char:

// requires: using System.Linq;
string vWord = "65 66 67 97 98 99";
string result = string.Join("", vWord.Split().Select(n => (char)int.Parse(n)));

Console.WriteLine($"Result string: {result}");

This method, however, doesn't perform any error checking on the input string, which is not a great idea when dealing with user input. It's better to use int.TryParse() to validate each part:

// requires: using System.Text;
var result = new StringBuilder();
var asciiValues = vWord.Split();

foreach (string charValue in asciiValues) {
    // accept only values in the ASCII range 0-127
    if (int.TryParse(charValue, out int n) && n >= 0 && n < 128) {
        result.Append((char)n);
    }
    else {
        Console.WriteLine($"{charValue} is not a valid input");
        break;
    }
}
Console.WriteLine($"Result string: {result}");

You could also use the Encoding.ASCII.GetString method to convert the byte array generated by byte.Parse back to a string. For example, using LINQ's Select:

// requires: using System.Linq; and using System.Text;
string vWord = "65 66 67 97 98 267";
try
{
    var charArray = vWord.Split().Select(n => byte.Parse(n)).ToArray();
    string result = Encoding.ASCII.GetString(charArray);
    Console.WriteLine($"String result: {result}");
}
catch (Exception)
{
    Console.WriteLine("Not a valid input");
}

This will print "Not a valid input", because one of the values (267) is greater than 255, so byte.Parse throws an OverflowException.
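As a sketch of the same approach, you could also catch the specific exceptions byte.Parse can throw (FormatException and OverflowException) instead of a bare Exception, to report what went wrong:

try
{
    var charArray = vWord.Split().Select(byte.Parse).ToArray();
    Console.WriteLine($"String result: {Encoding.ASCII.GetString(charArray)}");
}
catch (FormatException)
{
    Console.WriteLine("Not a valid input: one of the parts is not a number");
}
catch (OverflowException)
{
    Console.WriteLine("Not a valid input: one of the values is outside the byte range 0-255");
}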


Should you decide to allow an input string composed of contiguous values:

651016667979899112101 => "AeBCabcpe"

You could adopt this variation:

string vWord2 = "11065666797989911210110177";
int step = 2;
var result2 = new StringBuilder();

for (int i = 0; i < vWord2.Length; i += step)
{
    if (int.TryParse(vWord2.Substring(i, step), out int n) && n < 128)
    {
        // A two-digit window that reads 10, 11 or 12 means the value is really
        // a three-digit code (100-127), so re-read it with a three-digit window.
        if (n <= 12 && i == 0) {
            i = -3; step = 3;
        }
        else if (n <= 12 && i >= 2) {
            step = 3; i -= step;
        }
        else {
            result2.Append((char)n);
            if (step == 3) ++i;  // a three-digit value consumes one extra position
            step = 2;
        }
    }
    else {
        Console.WriteLine($"{vWord2.Substring(i, step)} is not a valid input");
        break;
    }
}
Console.WriteLine($"Result string: {result2}");
Result string: nABCabcpeeM

As Tom Blodget requested, a note about the automatic conversion between the ASCII character set and Unicode CodePoints.

This code produces ASCII characters by casting integer values, corresponding to positions in the ASCII table, to the char type, and stores the result in a standard Windows Unicode (UTF-16LE) string.
Why is there no need to explicitly convert the ASCII chars to their Unicode representation?
Because, for historical reasons, the lower Unicode CodePoints map directly to the standard ASCII table (the US-ASCII table).
Hence, no conversion is required, or rather it can be considered implicit.
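A minimal illustration of that direct mapping (the literal values are just the standard ASCII codes for 'a' and 'A'):

char a = (char)97;    // ASCII 97 and Unicode U+0061 are the same value: 'a'
int code = 'A';       // 65 - the implicit char-to-int conversion mentioned in the comments above
Console.WriteLine($"{a} {code}");  // prints: a 65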
But, since the .Net string type uses UTF-16LE Unicode internally (one 16-bit code unit for each character in the Basic Multilingual Plane, two 16-bit code units for CodePoints greater than or equal to 2^16), the memory allocation in bytes for the string is double the number of characters.
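For instance, a CodePoint above 2^16 takes two char values in a string (char.ConvertFromUtf32 builds a string from any Unicode CodePoint):

string bmp = "A";                                // U+0041, in the Basic Multilingual Plane
string astral = char.ConvertFromUtf32(0x1D11E);  // U+1D11E, MUSICAL SYMBOL G CLEF
Console.WriteLine(bmp.Length);     // 1 - one UTF-16 code unit
Console.WriteLine(astral.Length);  // 2 - a surrogate pair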
In the .Net Reference Source, StringBuilder.ToString() will call the internal wstrcpy method:

wstrcpy(char *dmem, char *smem, int charCount)

which will then call Buffer.Memcpy:

Buffer.Memcpy((byte*)dmem, (byte*)smem, charCount * 2);

where the size in bytes is set to charCount * 2.
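The factor of two is also visible from managed code through Encoding.Unicode, which is the UTF-16LE encoding:

string s = "ABC";
Console.WriteLine($"{s.Length} chars, {Encoding.Unicode.GetByteCount(s)} bytes");  // 3 chars, 6 bytes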

Since the first draft, in the '80s (when the first Universal Character Set (UCS) was developed), one of the primary objectives of ISO/IEC and the Unicode Consortium (the two main entities that were developing the standard) was to preserve compatibility with the pre-existing 256-character sets widely used at the time.

Preserving the CodePoint definitions, and thus preserving compatibility over time, is a strict rule in the Unicode world. This concept and these rules apply to all modern variable-length Unicode encodings (UTF-8, UTF-16, UTF-16LE, UTF-32, etc.) and to all CodePoints in the Basic Multilingual Plane (CodePoints in the ranges U+0000 to U+D7FF and U+E000 to U+FFFF).

On the other hand, there's no explicit guarantee that the same local code page encoding (often referred to as ANSI encoding) will produce the same result on two machines, even when the same system (and system version) is in use.
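As a small sketch of that difference, the same byte decodes to different characters under two Windows code pages (on .NET Core/.NET 5+ this assumes a reference to the System.Text.Encoding.CodePages package; on .NET Framework the registration line is not needed):

// Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);  // .NET Core / .NET 5+ only
byte[] data = { 0xE4 };
Console.WriteLine(Encoding.GetEncoding(1252).GetString(data));  // "ä" (Western European)
Console.WriteLine(Encoding.GetEncoding(1251).GetString(data));  // "д" (Cyrillic)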

Some other notes about Localization and the Unicode Common Locale Data Repository (CLDR)

– Jimi
  • 1
    The problem with the non-separated string is that you cannot get characters above 99, which is a `c`. So you miss out on most lowercase characters. You wil not have that problem with the space-separated input – Hans Kesting Dec 15 '18 at 10:13
  • Thanks Mate, this put me on the right track. Much appreciated! – Ricky Sinclair Dec 15 '18 at 10:21
  • C# uses UTF-16. It's a long story how what's given as an ASCII code unit can be used as a UTF-16 code unit. But a short comment is in order. – Tom Blodget Dec 17 '18 at 23:53
  • @Tom Blodget Do you mean, I should add a note about why an ASCII value can be translated to a standard Windows string (UTF-16LE) without encoding? In other words, an historical summary of the reasons why the lower Unicode CodePoints are directly mapped to the pre-existing US-ASCII table? I can do that, sure; can you say why you deem this important, here? – Jimi Dec 18 '18 at 00:47
  • The problem statement implies a character encoding conversion. Casting only works in very limited combinations of source character encodings. So, I suggest a code comment stating so, sufficient for the uninitiated to go do research. Or make the conversion explicit with the standard Encoding methods. – Tom Blodget Dec 18 '18 at 00:54
  • Nice explanation. See, it is a long way conceptually from ASCII bytes to .NET `char` (UTF-16 code unit) and `String`. – Tom Blodget Dec 23 '18 at 12:34

You can break the problem down into two parts:

P1. You want to take a string of space-separated numbers and convert them to int values:

private static int[] NumbersFromString(string input)
{
    var parts = input.Split(new string[] { " " }, StringSplitOptions.RemoveEmptyEntries);
    var values = new List<int>(parts.Length);
    foreach (var part in parts)
    {
        int value;
        if (!int.TryParse(part, out value))
        {
            throw new ArgumentException("One or more values in the input string are invalid.", "input");
        }
        values.Add(value);
    }
    return values.ToArray();
}

P2. You want to convert those numbers into character representations:

private static string AsciiCodesToString(int[] inputValues)
{
    var builder = new StringBuilder();
    foreach (var value in inputValues)
    {
        builder.Append((char)value);
    }
    return builder.ToString();
}
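Note that AsciiCodesToString, as written, will cast any int to a char, including values outside the ASCII range. If you want to reject those as well, one possible variant of the loop body (a sketch, using the same builder and inputValues names) is:

foreach (var value in inputValues)
{
    // ASCII CodePoints are 0-127; reject everything else
    if (value < 0 || value > 127)
    {
        throw new ArgumentException($"Value {value} is outside the ASCII range.", "inputValues");
    }
    builder.Append((char)value);
}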

You can then call it something like this:

Console.WriteLine(AsciiCodesToString(NumbersFromString(input)));


– ProgrammingLlama