This is just a language curiosity rather than an actual problem.
The ElementAt() extension method on IEnumerable<T> takes an int index and returns the Nth element of a sequence. For example:
var list = new List<char>() { 'a', 'b', 'c' };
var alias = list.AsEnumerable();
int N = 0;
alias.ElementAt(N); // gets 'a' (ElementAt and AsEnumerable need using System.Linq)
All good. However, why doesn't ElementAt() accept unsigned integers (uint)? For example:
uint N = 0;
alias.ElementAt(N); // doesn't compile: no implicit conversion from uint to int
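The obvious workaround is an explicit cast (the checked() wrapper is my own addition, to guard against values above int.MaxValue):

uint N = 0;
alias.ElementAt(checked((int)N)); // compiles; would throw OverflowException if N > int.MaxValue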
I can understand why ElementAt() accepts signed integers: some languages allow negative indices (in Python, list[-1] refers to the last element), so a signed index makes sense for languages that do use them, even if C# doesn't.
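(Not that C# honours them: ElementAt() rejects a negative index at runtime rather than counting from the end.)

int M = -1;
alias.ElementAt(M); // compiles, but throws ArgumentOutOfRangeException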
But I can't quite see the reasoning for disallowing unsigned integers. If anything, an unsigned index seems better, since it guarantees the value can never be negative, so only the upper bound of the range needs to be checked (see the sketch below).
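To illustrate, here is a hypothetical guarded lookup, purely my own sketch of the bounds checks rather than anything from the BCL:

using System;
using System.Collections.Generic;

static class Lookup
{
    // signed index: both bounds have to be checked
    public static char GetSigned(List<char> xs, int i)
    {
        if (i < 0 || i >= xs.Count) throw new ArgumentOutOfRangeException(nameof(i));
        return xs[i];
    }

    // unsigned index: a negative value is impossible, so one check suffices
    public static char GetUnsigned(List<char> xs, uint i)
    {
        if (i >= xs.Count) throw new ArgumentOutOfRangeException(nameof(i));
        return xs[(int)i]; // safe cast: i < xs.Count <= int.MaxValue
    }
}

(Interestingly, as far as I can tell the framework often gets the best of both by casting a signed index to uint, so the two comparisons collapse into one: if ((uint)i >= (uint)xs.Count) ...)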
The best explanation I could come up with is that the CLR team decided to standardize on signed integers so that languages which do use negative indices (e.g. Python) could share the same code, and so index ranges would be consistent across languages.
Does anyone have a better/authoritative explanation for why ElementAt() doesn't allow unsigned ints?
-Marcin