
At the moment I'm selecting a lot of rows (75681) out of a database table, putting them into a DataTable, and then running a foreach loop over each row:

foreach (DataRow row in dt.Rows)
{
   //Updating record
}

I'm looking to split this into chunks of 500 or 1000 rows for performance, but I'm unsure how to go about it.

Can I tell the foreach loop to process 500 rows at a time, or do I have to somehow split the DataTable into 500-row chunks? Something like the sketch below is what I have in mind.
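(An untested sketch using LINQ's Skip/Take to slice the DataTable; the chunk size and the processing step are placeholders.)

// Walk the DataTable in fixed-size slices with LINQ's Skip/Take
// (needs using System.Linq; and using System.Data;)
var rows = dt.Rows.Cast<DataRow>();

for (int offset = 0; offset < dt.Rows.Count; offset += 500)
{
    foreach (DataRow row in rows.Skip(offset).Take(500))
    {
        //Updating record
    }
    // Commit/flush this batch of 500 before moving on to the next
}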

Hope the above makes sense!

Andrew


1 Answer


At the moment I'm selecting a lot of rows (75681) out of a database table, putting them into a DataTable, and then running a foreach loop over each row

Why the heck are you materializing all records in memory? I hope not for this loop.

or do I have to somehow split the DataTable into 500-row chunks?

What if you just put a DataReader in the loop, materialize each row into an object, and process it right there? The Parallel framework (TPL) comes to mind.
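A minimal sketch of that idea, assuming SQL Server's SqlDataReader, a hypothetical SourceTable, and placeholder column ordinals; only the current row is held in memory at any time:

using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString)) // connectionString assumed defined
using (var cmd = new SqlCommand("SELECT * FROM SourceTable", conn))
{
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Materialize just the current row into an object...
            var id = reader.GetInt32(0);
            var name = reader.GetString(1);
            // ...then process/update it here, or hand it off to
            // Parallel.ForEach via a buffered collection if the
            // per-row work is CPU-bound.
        }
    }
}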

TomTom
  • TBH, I've not been working with C# for long. I'm currently writing an MVC project to replace a RoR app, and I'm migrating the data from PostgreSQL to MS SQL: I'm selecting all the rows from my PostgreSQL database and then inserting them into MS SQL. Should I not be adding each row to the data table then? – Andrew D Dec 16 '12 at 17:49
  • 1
    No. This is a pure transfer job? ZERO sense to keep more than the rows yo ureally work on AT THIS MOMENT in memory. Loop along one table, cut the data into larger segments and issue blocked insert statements on the other side (i.e. if possible not one per line - that will be slow). – TomTom Dec 16 '12 at 17:54
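For illustration, a sketch of that transfer pattern, assuming the Npgsql driver, placeholder connection strings, and hypothetical table names. Here SqlBulkCopy (rather than hand-rolled INSERT batches) streams straight off the reader, with BatchSize controlling how many rows go per round trip:

using System.Data.SqlClient;
using Npgsql;

using (var source = new NpgsqlConnection(pgConnectionString)) // assumed defined
using (var dest = new SqlConnection(sqlConnectionString))     // assumed defined
{
    source.Open();
    dest.Open();

    using (var cmd = new NpgsqlCommand("SELECT * FROM source_table", source))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dest))
    {
        bulk.DestinationTableName = "dbo.DestTable"; // hypothetical
        bulk.BatchSize = 500; // rows sent to SQL Server per batch

        // Streams rows from PostgreSQL into SQL Server without
        // materializing the whole table in memory.
        bulk.WriteToServer(reader);
    }
}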