I've got a simple list of hashsets:
List<HashSet<TestObj>> listOfSets = new List<HashSet<TestObj>>();
When I iterate it manually, there is no additional memory allocation:
foreach (var childSet in listOfSets) {
    foreach (var neighbor in childSet) {
        // do something with neighbor
    }
}
However, if I do the same using LINQ's SelectMany, it allocates memory on the heap once per childSet:
foreach (TestObj neighbor in listOfSets.SelectMany(x => x)) {
    // do something with neighbor
}
Any idea why it does that? Is there a way around it? If not, is there another way to return an IEnumerable<TestObj> from a list of sets so that iterating it won't allocate any memory?
Edit: It appears to have something to do with boxing. When I convert the list to IEnumerable<IEnumerable<TestObj>>, direct iteration also triggers memory allocations. Any idea what exactly is happening?
// This code triggers a heap allocation for every set in listOfSets
IEnumerable<IEnumerable<TestObj>> enumOfEnum = listOfSets;
foreach (var childSet in enumOfEnum) {
    foreach (var neighbor in childSet) {
        // do something with neighbor
    }
}
Edit: Ok, so SelectMany is an extension method on IEnumerable<T>, and its selector here (x => x) hands each set back as an IEnumerable<TestObj>. When I call SelectMany on a List<HashSet<TestObj>>, every HashSet<TestObj> therefore gets enumerated through the IEnumerable<TestObj> interface. Because HashSet<T>.Enumerator is a struct, calling GetEnumerator() through that interface boxes the enumerator, once per set (which is normal behavior when converting a struct to an interface). So it's not the HashSets themselves that are boxed (they're reference types); it's their struct enumerators that end up on the heap.
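As for a way around it: as far as I can tell, anything consumed through IEnumerable<TestObj> will put something on the heap, because foreach then has to go through the interface. A minimal sketch of an allocation-free alternative, again assuming the TestObj and listOfSets from above (FlattenEnumerable and FlattenEnumerator are made-up names, not an existing API), is a pattern-based struct enumerable that stores the concrete struct enumerators as fields:

using System.Collections.Generic;

// Sketch only: flattens a List<HashSet<TestObj>> without boxing anything,
// as long as callers foreach over it directly instead of casting it
// to IEnumerable<TestObj>.
public readonly struct FlattenEnumerable
{
    private readonly List<HashSet<TestObj>> _sets;

    public FlattenEnumerable(List<HashSet<TestObj>> sets) => _sets = sets;

    // foreach only needs a public GetEnumerator() whose result exposes
    // MoveNext() and Current; no interface implementation is required.
    public FlattenEnumerator GetEnumerator() => new FlattenEnumerator(_sets);
}

public struct FlattenEnumerator
{
    private List<HashSet<TestObj>>.Enumerator _outer;
    private HashSet<TestObj>.Enumerator _inner;
    private bool _hasInner;

    public FlattenEnumerator(List<HashSet<TestObj>> sets)
    {
        _outer = sets.GetEnumerator();
        _inner = default;
        _hasInner = false;
        Current = default;
    }

    public TestObj Current { get; private set; }

    public bool MoveNext()
    {
        while (true)
        {
            // Keep pulling from the current set's struct enumerator.
            if (_hasInner && _inner.MoveNext())
            {
                Current = _inner.Current;
                return true;
            }
            // Move to the next set, or stop once the list is exhausted.
            if (!_outer.MoveNext())
                return false;
            _inner = _outer.Current.GetEnumerator();
            _hasInner = true;
        }
    }
}

Used like this:

foreach (TestObj neighbor in new FlattenEnumerable(listOfSets)) {
    // do something with neighbor
}

The trade-off is that the type deliberately does not implement IEnumerable<TestObj>; adding the interface and enumerating through it would box the struct enumerator again. If the result really has to be an IEnumerable<TestObj>, a yield return iterator seems to be the next best thing: it still allocates the compiler-generated state machine once per enumeration, but not once per set, because inside the iterator both foreach loops bind to the concrete struct enumerators.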