
Actually, I got a requirement to save in the DB all combinations of a 9-digit number in encrypted format.

So I used a very basic encryption algorithm and thought of using a for loop up to 999,999,999, saving the records in a DataTable, and bulk-copying them to SQL Server.

My program goes like this:

DataTable DtData = new DataTable();
DtData.Columns.Add("ENCRYPTED_DATA", typeof(string));        

for (Int64 i = 1; i <= 999999999; i++)
{
    string Number = i.ToString("D9");

    string Encrypt = EncryptDecrypt.Encrypt(Number);

    DtData.Rows.Add(Encrypt);

    if (i % 100000 == 0)
    {
        GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);
    }
}

But my problem is that I'm getting an OutOfMemoryException after some time.
Is there a way to garbage collect, or some other way to reduce the memory consumption?
The GC.Collect call is not reducing the memory usage at all, as far as I can see in Task Manager.

My PC has 16 GB of RAM, and processing 8,000,000 records takes approximately 7 minutes.
After the process consumes the 16 GB (as shown in Task Manager), it throws the OutOfMemoryException.
Is there a way to reduce the memory consumption so that my for loop can run to completion?

Humayun Shabbir
RealSteel

2 Answers


The line

GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);

does not free the data already in the DataTable DtData. I don't know the size of the strings you are creating, but you are creating hundreds of millions of them and adding them all to DtData.

Because DtData still holds references to them, these strings are never eligible for garbage collection within the scope of this loop.

You should periodically commit the rows to the database, as in the following:

DataTable DtData = new DataTable();
DtData.Columns.Add("ENCRYPTED_DATA", typeof(string));        

for (Int64 i = 1; i <= 999999999; i++)
{
    string Number = i.ToString("D9");

    string Encrypt = EncryptDecrypt.Encrypt(Number);

    DtData.Rows.Add(Encrypt);

    //vary this number depending on your performance testing and application needs
    if (i % 100000 == 0)
    {
        //instead of this 
        //GC.Collect(GC.MaxGeneration, GCCollectionMode.Forced);


        //commit changes and refresh your DataTable

        DoSomeDatabaseCommitHere(DtData);
        DtData.Clear(); //Clear() keeps the ENCRYPTED_DATA column; new DataTable() would lose the schema

    }
}

//flush any rows left over from the final partial batch
if (DtData.Rows.Count > 0)
{
    DoSomeDatabaseCommitHere(DtData);
}

You can also employ one of the collection classes (see MSDN) instead of a DataTable for more options, such as asynchronous uploading.
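For illustration, the batch-and-clear pattern above can be sketched end to end. The commit step is a stub that only counts rows (in the real program it would be a `SqlBulkCopy.WriteToServer` call), and the totals are scaled down so the demo runs quickly; the class name `BatchDemo` and the numbers are placeholders, not part of the original question:

```csharp
using System;
using System.Data;

class BatchDemo
{
    public static void Main()
    {
        long committed = 0;
        int commits = 0;
        // Stub commit: the real program would bulk-copy the batch to SQL Server here.
        Action<DataTable> commit = batch => { committed += batch.Rows.Count; commits++; };

        var dtData = new DataTable();
        dtData.Columns.Add("ENCRYPTED_DATA", typeof(string));

        const long total = 250000;     // small total for the demo; the question uses 999,999,999
        const long batchSize = 100000; // tune this based on performance testing

        for (long i = 1; i <= total; i++)
        {
            dtData.Rows.Add(i.ToString("D9"));
            if (i % batchSize == 0)
            {
                commit(dtData);
                dtData.Clear(); // keeps the column schema, unlike new DataTable()
            }
        }
        if (dtData.Rows.Count > 0) // flush the final partial batch
        {
            commit(dtData);
            dtData.Clear();
        }

        Console.WriteLine($"{commits} commits, {committed} rows");
    }
}
```

Because each batch is handed off and cleared, memory stays bounded at roughly one batch of rows instead of growing toward a billion.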

jordanhill123
  • Yes, I'm going to try that. Thank you for the support. – RealSteel Aug 01 '14 at 02:13
  • As I've calculated, it will take more than 17 hours to complete the whole cycle. Is there a way to make it faster? If I use a List instead of a DataTable, will it help? – RealSteel Aug 01 '14 at 02:59
  • A List should be faster, though a HashSet may be even more performant. See http://stackoverflow.com/questions/150750/hashset-vs-list-performance for more info on List vs HashSet. You could also change the line `if (i % 100000 == 0)` to use a larger number, to have fewer bulk commits. In addition, you can look at using background threads for committing while encrypting. See my link regarding the various collection classes that would assist with this. – jordanhill123 Aug 01 '14 at 17:45
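The background-thread idea from that comment can be sketched as a producer/consumer pipeline built on `BlockingCollection<T>` (one of the concurrent collection classes referenced in the answer). This is a minimal sketch, not the answerer's actual code: the consumer just counts rows where the real program would call `SqlBulkCopy`, the totals are scaled down for the demo, and the bounded capacity of 4 is an illustrative choice:

```csharp
using System;
using System.Collections.Concurrent;
using System.Data;
using System.Threading.Tasks;

class PipelineDemo
{
    public static void Main()
    {
        // Bounded queue: the producer blocks once 4 batches are waiting,
        // capping memory at roughly 4 batches plus the one being filled.
        using (var queue = new BlockingCollection<DataTable>(boundedCapacity: 4))
        {
            long committed = 0;

            // Consumer thread: would run SqlBulkCopy in the real program.
            Task consumer = Task.Run(() =>
            {
                foreach (DataTable batch in queue.GetConsumingEnumerable())
                    committed += batch.Rows.Count;
            });

            DataTable current = NewBatch();
            const long total = 250000, batchSize = 100000; // small numbers for the demo

            for (long i = 1; i <= total; i++)
            {
                current.Rows.Add(i.ToString("D9"));
                if (i % batchSize == 0)
                {
                    queue.Add(current); // hand off; this thread keeps producing
                    current = NewBatch();
                }
            }
            if (current.Rows.Count > 0) queue.Add(current); // final partial batch

            queue.CompleteAdding();
            consumer.Wait();
            Console.WriteLine(committed);
        }
    }

    static DataTable NewBatch()
    {
        var dt = new DataTable();
        dt.Columns.Add("ENCRYPTED_DATA", typeof(string));
        return dt;
    }
}
```

The producer never waits for the database round trip, so encryption and uploading overlap, which addresses the 17-hour concern raised in the comments.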

Your DtData is filling up and taking all your memory.

All the garbage collection in the world isn't going to help.

Save your data to the database every so often and empty the DtData DataTable.

Steve Wellens