
I have a list with around 190 elements in it for now. How can I split the list into smaller lists with a max of 50 elements in each list?

The result could be lists of 50, 50, 50 and 40 elements.

Peter Mortensen
Awesome
  • possible duplicate of [Split List into Sublists with LINQ](http://stackoverflow.com/questions/419019/split-list-into-sublists-with-linq) – psubsee2003 Apr 13 '14 at 11:45

4 Answers


Assuming you mean List<T>, you can use the GetRange method repeatedly. Heck, you could do this with LINQ:

var lists = Enumerable.Range(0, (list.Count + size - 1) / size)
      .Select(index => list.GetRange(index * size,
                                     Math.Min(size, list.Count - index * size)))
      .ToList();

Or you could just use a loop, of course:

public static List<List<T>> Split(List<T> source, int size)
{
    // TODO: Validate that size is >= 1
    // TODO: Prepopulate with the right capacity
    List<List<T>> ret = new List<List<T>>();
    for (int i = 0; i < source.Count; i += size)
    {
        ret.Add(source.GetRange(i, Math.Min(size, source.Count - i)));
    }
    return ret;
}

This is somewhat more efficient than using GroupBy, although it's limited to List<T> as an input.

We have another implementation using IEnumerable<T> in MoreLINQ in Batch.cs.
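For the curious, such an IEnumerable<T>-based batching iterator can be sketched roughly like this (an illustration of the idea only, not the actual MoreLINQ code):

```csharp
using System;
using System.Collections.Generic;

static class BatchExtensions
{
    // Lazily yields lists of up to 'size' elements from any sequence,
    // buffering only one batch at a time.
    public static IEnumerable<List<T>> Batch<T>(this IEnumerable<T> source, int size)
    {
        if (size < 1) throw new ArgumentOutOfRangeException(nameof(size));

        var bucket = new List<T>(size);
        foreach (var item in source)
        {
            bucket.Add(item);
            if (bucket.Count == size)
            {
                yield return bucket;
                bucket = new List<T>(size);
            }
        }
        if (bucket.Count > 0)
            yield return bucket; // final partial batch (e.g. the 40-element tail)
    }
}
```

Unlike GetRange, this works on any IEnumerable<T> and never needs the whole source in memory at once.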

Jon Skeet
  • Historical: I never ever saw Jon, not even a split second, on -1. I will ask in his place: would the minus voter care to elaborate? --> and of course he beat me to that one :-) – Peter Apr 08 '11 at 07:59
  • @Jon, I didn't downvote but I think it was due to the late example. – Darin Dimitrov Apr 08 '11 at 08:00
  • -1; leaving a half answer with "coming soon" doesn't make sense. – Sanjeevakumar Hiremath Apr 08 '11 at 08:01
  • @Sanjeevakumar: `GetRange` is really the heart of it though... the rest is pretty trivial, and that shows the efficient way of creating one list from another. I wanted to make *that* information available while creating the sample code. Of course, I've now expanded the answer with two examples, and a link to another implementation which is still more efficient than using GroupBy... – Jon Skeet Apr 08 '11 at 08:03
  • +1 for informative. Another downvoter, however, even after the examples as far as I can see. I would like that minus vote explained (just out of curiosity). – Peter Apr 08 '11 at 08:09
  • Thanks Jon. The foreach seems to be the most simple and clean answer. – Awesome Apr 08 '11 at 08:10
  • Why would you post a one-liner about "sample coming"? What's the benefit of that? – Mikael Östberg Apr 08 '11 at 08:22
  • @MikeEast: Because `GetRange` was the most important part, in my view. It's a simple and efficient way of getting part of a list, as another list. Once you know about it, using it is easy. – Jon Skeet Apr 08 '11 at 08:24
  • Note that for `size == 0`, this function will enter an infinite loop. – Tomer Sep 20 '21 at 01:35
  • @Tomer: I've added a validation TODO in there. – Jon Skeet Sep 20 '21 at 05:31

You could use LINQ:

var list = Enumerable.Range(1, 190);
var sublists = list
    .Select((x, i) => new { Index = i, Value = x })
    .GroupBy(x => x.Index / 50)
    .Select(x => x.Select(v => v.Value).ToList())
    .ToArray();
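Note: if you are on .NET 6 or later (which postdates these answers), the framework's built-in Enumerable.Chunk does this splitting directly:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Enumerable.Chunk (new in .NET 6) splits a sequence into
        // arrays of at most 50 elements each.
        var chunks = Enumerable.Range(1, 190).Chunk(50).ToList();
        Console.WriteLine(string.Join(", ", chunks.Select(c => c.Length)));
        // prints: 50, 50, 50, 40
    }
}
```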
Darin Dimitrov

I've attempted a recursive approach. Just to see what it would look like.

List<List<T>> SplitIntoChunks<T>(IEnumerable<T> originalList, int chunkSize)
{
    if (!originalList.Any())
    {
        return new List<List<T>>();
    }

    var chunks = new List<List<T>> {originalList.Take(chunkSize).ToList()};
    chunks.AddRange(SplitIntoChunks(originalList.Skip(chunkSize), chunkSize));
    return chunks;
}
Matt Ellen
var list = new List<int>(Enumerable.Range(1,190));
var page_size = 50;
var max_pages = (list.Count + page_size - 1) / page_size; // ceiling division; "1 + Count / size" would yield an empty extra page when the count divides evenly

for(int page = 1; page <= max_pages; page++) {
  var chunk = list.Skip(page_size * (page-1)).Take(page_size);
  // do whatever
}
James Kyburz