I often use Char.IsDigit to check whether a char is a digit, which is especially handy in LINQ queries as a pre-check for int.Parse, as in "123".All(Char.IsDigit). But there are chars that are digits yet can't be parsed to an int, like '۵'.
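The pattern I mean looks roughly like this (a simplified sketch; s is just an example string):

using System;
using System.Linq;

string s = "123";

// pre-check with Char.IsDigit before parsing
if (s.All(Char.IsDigit))
{
    int value = int.Parse(s); // 123
}

// with s = "۵" the pre-check passes as well,
// but int.Parse then throws a FormatException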
using System.Globalization;
using System.Linq;

// true
bool isDigit = Char.IsDigit('۵');

var cultures = CultureInfo.GetCultures(CultureTypes.SpecificCultures);
int num;

// false
bool isIntForAnyCulture = cultures
    .Any(c => int.TryParse('۵'.ToString(), NumberStyles.Any, c, out num));
Why is that? Does that mean my int.Parse pre-check via Char.IsDigit is incorrect?
There are 310 chars for which Char.IsDigit returns true:
List<char> digitList = Enumerable.Range(0, UInt16.MaxValue)
    .Select(i => Convert.ToChar(i))
    .Where(c => Char.IsDigit(c))
    .ToList();
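For comparison, that list can be filtered down to the digits int.TryParse actually accepts (a rough ad-hoc sketch, invariant culture only, reusing digitList from above):

int n;
List<char> parseableDigits = digitList
    .Where(c => int.TryParse(c.ToString(), NumberStyles.Any,
                             CultureInfo.InvariantCulture, out n))
    .ToList();
// '۵' is not in this list; as far as I can tell only the ASCII '0'..'9' survive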
Here's the implementation of Char.IsDigit in .NET 4 (ILSpy):
public static bool IsDigit(char c)
{
    if (char.IsLatin1(c))
    {
        return c >= '0' && c <= '9';
    }
    return CharUnicodeInfo.GetUnicodeCategory(c) == UnicodeCategory.DecimalDigitNumber;
}
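Since '۵' is well above the Latin-1 range, it has to be taking the second branch, which is easy to confirm (and, as far as I can tell, .NET even knows the char's numeric value, it just won't parse it as part of an int):

// '۵' is not a Latin-1 char, so IsDigit falls through to the category check
Console.WriteLine(CharUnicodeInfo.GetUnicodeCategory('۵')); // DecimalDigitNumber
Console.WriteLine(Char.GetNumericValue('۵'));               // 5 (the value Unicode assigns to it)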
So why are there chars that belong to the DecimalDigitNumber category ("Decimal digit character, that is, a character in the range 0 through 9...") which can't be parsed to an int in any culture?