
I'd like to select all my data as an unpredictable number of quaternary packages (packs that each contain four items), something like the following:

foreach (var quaternary in myEnumerable.ToQuaternaryPackages())
{
    // whatever (e.g. l = page.Add(new List()))
    foreach (var item in quaternary)
    {
        // whatever (e.g. l.Add(item))
    }
}
Mohsen

5 Answers


I think you're looking for something like the MoreLINQ Batch method:

foreach (var batch in myEnumerable.Batch(4))
{
    foreach (var item in batch)
    {
        // ...
    }
}

Note that the final batch will have fewer than 4 items if the total number isn't divisible by 4. (For example, if there are 14 items initially, you'll get three batches of 4 and then a batch of 2.)
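As a quick sketch of that behaviour (my example, assuming MoreLINQ's Batch extension is in scope):

var items = Enumerable.Range(1, 14);
foreach (var batch in items.Batch(4))
{
    // Prints "1, 2, 3, 4", "5, 6, 7, 8", "9, 10, 11, 12", then "13, 14".
    Console.WriteLine(string.Join(", ", batch));
}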

You can fetch MoreLINQ either as one big NuGet package, or as individual packages (e.g. the Batch package).

Jon Skeet

First, make a trivial Batch method (put it in a static class):

public static IEnumerable<IEnumerable<T>> Batch<T>(this IEnumerable<T> source, int batchSize)
{
    List<T> temp = new List<T>();
    foreach (T item in source)
    {
        temp.Add(item);
        if (temp.Count == batchSize)
        {
            yield return temp.Select(n => n);
            temp.Clear();
        }
    }
    if (temp.Any())
    {
        yield return temp.Select(n => n);
    }
}

And then just use it like this:

foreach (var quaternary in myEnumerable.Batch(4))
{
    // whatever (e.g. l = page.Add(new List()))
    foreach (var item in quaternary)
    {
        // whatever (e.g. l.Add(item))
    }
}
It'sNotALie.
  • Your batch method is a bit broken IMO - as soon as you fetch the "next" batch, you end up making the previous one unusable (as you're reusing the list). That's fine for some uses, but you'd need to be very clear about it. – Jon Skeet Jun 12 '13 at 06:58
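A minimal sketch of one way to avoid that reuse (my variant, not the answer's code): hand each batch its own list instead of clearing a shared one.

public static IEnumerable<IEnumerable<T>> Batch<T>(this IEnumerable<T> source, int batchSize)
{
    var temp = new List<T>();
    foreach (T item in source)
    {
        temp.Add(item);
        if (temp.Count == batchSize)
        {
            // Yield the filled list and start a fresh one, so batches
            // already handed out are never mutated afterwards.
            yield return temp;
            temp = new List<T>();
        }
    }
    if (temp.Count > 0)
    {
        yield return temp;
    }
}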

Shamelessly stolen from Troy Goode's PagedList library: https://github.com/TroyGoode/PagedList/blob/master/src/PagedList/PagedListExtensions.cs

public static IEnumerable<IEnumerable<T>> Partition<T>(this IEnumerable<T> superset, int pageSize)
{
  if (superset.Count() < pageSize)
    yield return superset;
  else
  {
    var numberOfPages = Math.Ceiling(superset.Count() / (double)pageSize);
    for (var i = 0; i < numberOfPages; i++)
      yield return superset.Skip(pageSize * i).Take(pageSize);  
  }
}

Use it like so:

var result = myEnumerable.Partition(4);
Robert McKee

If you really need the result to be unpredictable (i.e. random), I suggest the following algorithm:

  1. Create a random permutation of all elements in the list.
  2. Bucket them into quadruples.

For the second part, many good answers have already been provided. For the first part, there is a great in-depth series on creating permutations with LINQ by Eric Lippert.
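To make that concrete, here is a minimal sketch of both steps, assuming one of the Batch extensions from the other answers is in scope; the ShuffleIntoQuadruples name and the plain Fisher-Yates shuffle are my own illustration, not Eric Lippert's LINQ approach:

private static readonly Random Rng = new Random();

public static IEnumerable<IEnumerable<T>> ShuffleIntoQuadruples<T>(this IEnumerable<T> source)
{
    // Step 1: random permutation via an in-place Fisher-Yates shuffle.
    var items = source.ToList();
    for (int i = items.Count - 1; i > 0; i--)
    {
        int j = Rng.Next(i + 1);
        var swap = items[i];
        items[i] = items[j];
        items[j] = swap;
    }

    // Step 2: bucket the shuffled items into quadruples.
    return items.Batch(4);
}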

Heinzi

It's quite simple using LINQ:

public static IEnumerable<IEnumerable<T>> Batch<T>(this IEnumerable<T> source, int batchSize)
{
    for (int i = 0; i < source.Count(); i+=batchSize)
    {
        yield return source.Skip(i).Take(batchSize);
    }
}

This splits the sequence into batches of batchSize items; if the total count isn't divisible by the batch size, the final iteration yields whatever remains of the enumerable.

Igarioshka
  • Thanks, but I need IEnumerable<IEnumerable<T>> as batches – Mohsen Jun 12 '13 at 07:42
  • Sorry, my bad. It **will** return IEnumerable<IEnumerable<T>>... as it yields the batch. Just a mistype in the return statement; updating the answer. – Igarioshka Jun 12 '13 at 07:45
  • 1
  • Note that this will iterate over the source sequence for each batch - that could be disastrous for performance in some cases. Personally when I write LINQ operators, I try to make sure I only ever iterate over the sequence once. Also, you need `Count()` instead of `Count`. – Jon Skeet Jun 12 '13 at 08:01
  • @JonSkeet fixed the `Count` bit. Will dig deeper into optimizing the code, thanks for the hint. – Igarioshka Jun 12 '13 at 08:12
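For what it's worth, a small sketch that makes the re-enumeration visible (the Numbered source and its logging line are hypothetical, purely for illustration):

static IEnumerable<int> Numbered()
{
    for (int i = 0; i < 8; i++)
    {
        Console.WriteLine("enumerating " + i); // fires on every pass over the source
        yield return i;
    }
}

// Each loop test re-runs Count(), and every Skip/Take pair walks the
// source from the start, so "enumerating 0" prints several times.
foreach (var batch in Numbered().Batch(4))
{
    Console.WriteLine(string.Join(", ", batch));
}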