I have a C# method that needs to take the first character of a string and check whether it exists in a HashSet containing specific Unicode characters (all the right-to-left characters).
So I'm doing
var c = str[0];
and then checking the HashSet.
The problem is that this code doesn't work for strings whose first character has a code point above 65535 (U+FFFF).
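For example, with a code point like U+10800 (which I believe belongs to the Cypriot syllabary, an RTL script), I can reproduce the symptom directly:

```csharp
using System;

// Build a string from a single code point above U+FFFF.
string str = char.ConvertFromUtf32(0x10800);

Console.WriteLine(str.Length);   // prints 2, not 1
Console.WriteLine((int)str[0]);  // prints 55298 (0xD802), not 0x10800
```

So `str[0]` never equals the code point I put in.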
To test it, I wrote a loop over every number from 0 to 70,000 (the highest RTL code point is around 68,000, so I rounded up). For each number I create a byte array and call
Encoding.UTF32.GetString(intValue);
(where intValue is that byte array)
to create a string holding that one character. I then pass the string to the method that searches the HashSet, and the method fails, because when it reads
str[0]
the value is never what it should be.
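Roughly, my test loop looks like this (variable names reconstructed from memory, and the HashSet contents omitted):

```csharp
using System;
using System.Collections.Generic;
using System.Text;

var rtlChars = new HashSet<char>(); // populated elsewhere with the RTL characters

for (int i = 0; i <= 70_000; i++)
{
    // Skip the surrogate range: those numbers are not valid code points,
    // and UTF-32 decoding replaces them with U+FFFD.
    if (i >= 0xD800 && i <= 0xDFFF) continue;

    byte[] intValue = BitConverter.GetBytes(i);      // 4 bytes, little-endian on my machine
    string s = Encoding.UTF32.GetString(intValue);   // Encoding.UTF32 decodes UTF-32LE

    char c = s[0];               // this is where it goes wrong for i > 0xFFFF
    bool found = rtlChars.Contains(c);
}
```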
What am I doing wrong?