
I am facing a major problem in my application. While reading a big Excel file with more than nine hundred thousand records through the OLEDB ACE 12.0 provider and filling the returned data into a DataSet, after some time the system throws an OutOfMemoryException.

I have tried hard to find a solution for this, but so far without any luck.

Please help me with this issue if anyone has an idea.

Code :

MyConn = New OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & FileSource & ";Extended Properties=""Excel 12.0;IMEX=1""")
MyConn.Open()
'get the table schema information to retrieve the sheet names
Dim schemaTable As DataTable = MyConn.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, New Object() {Nothing, Nothing, Nothing, "TABLE"})
For Each dtrow As DataRow In schemaTable.Rows
    Dim iUploadStatus As Integer = 0
    sheetName = String.Empty
    sheet = dtrow("TABLE_NAME").ToString()
    'skip the duplicate sheets fetched in the schema table
    If sheet.Contains("_xlnm#_FilterDatabase") Or sheet.Contains("_FilterDatabase") Then
        Continue For
    End If
    MyCommand = New OleDbDataAdapter("select * from [" + sheet + "]", MyConn)
    MyCommand.TableMappings.Add("Table", "TestTable")
    DtSet = New DataSet
    MyCommand.Fill(DtSet)

    Using destinationConnection As New SqlConnection(Conn)
        ' open the destination connection
        destinationConnection.Open()
        Using bulkCopy As New SqlBulkCopy(destinationConnection)
            ' column mappings
            bulkCopy.ColumnMappings.Add(P1, ColProdNum)
            bulkCopy.ColumnMappings.Add(P2, ColProdDesc)
            bulkCopy.ColumnMappings.Add(P3, ColListPrice)
            bulkCopy.ColumnMappings.Add(P4, ColNetPrice)
            bulkCopy.BatchSize = 2000
            bulkCopy.NotifyAfter = 2000
            bulkCopy.DestinationTableName = "tabDestination"
            bulkCopy.WriteToServer(DtSet.Tables(0))
        End Using
    End Using
Next
  • Can you post some code? – Dhawalk Apr 30 '14 at 20:10
  • ...don't try to process all records at once? – Peter Ritchie Apr 30 '14 at 20:17
  • +1 to @PeterRitchie, but just for interest's sake, how much memory are you using when it crashes? You're limited to [2GB](http://stackoverflow.com/a/200356/622391) if you're running a 32-bit process... – Simon MᶜKenzie May 01 '14 at 00:47
  • You can get more than 2GB in a 32-bit process, but that's not really a valid design decision--you should design something that can accommodate data sets larger than available memory. – Peter Ritchie May 01 '14 at 00:57
  • @PeterRitchie: Thanks for your reply. Can you please provide a sample code snippet showing how to divide my Excel reading process into chunks of data and build a complete cumulative table at the end? – Mastersidd May 01 '14 at 07:11
  • @Mastersidd If you detail code that exemplifies how you're reading and processing excel files now, maybe someone can offer a change to that so it doesn't load everything in memory all at once. – Peter Ritchie May 01 '14 at 14:00
  • Hi everyone, I think I have a solution, but you won't believe how it got solved. I will post a complete solution tomorrow after testing the approach. :) – Mastersidd May 01 '14 at 14:53

1 Answer


The corrections I made are below; a sketch of the approach follows the list:

  1. I changed the reading process from Excel into the DataTable so that it works in chunks of 100,000 records.
  2. For each chunk that is read, I use SqlBulkCopy to push the data to the destination table.
  3. After each chunk of 100,000 records has been uploaded, I clear the rows of the DataTable so the memory is released before the next chunk is read.
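Below is a minimal sketch of that approach, assuming the same names as in the question (FileSource, Conn, the sheet name and the destination table tabDestination); the chunk size is the 100,000 rows mentioned above, and the column mappings are omitted for brevity:

    Imports System.Data
    Imports System.Data.OleDb
    Imports System.Data.SqlClient

    Module ChunkedExcelImport
        ' Read one worksheet through a data reader and push it to SQL Server
        ' in chunks, so the whole sheet is never held in memory at once.
        Sub ImportSheetInChunks(ByVal FileSource As String, ByVal sheet As String, ByVal Conn As String)
            Const ChunkSize As Integer = 100000

            Using excelConn As New OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" & FileSource & ";Extended Properties=""Excel 12.0;IMEX=1""")
                excelConn.Open()

                Using cmd As New OleDbCommand("select * from [" & sheet & "]", excelConn)
                    Using reader As OleDbDataReader = cmd.ExecuteReader()
                        ' Build an empty buffer table with the sheet's columns.
                        Dim buffer As New DataTable("TestTable")
                        For i As Integer = 0 To reader.FieldCount - 1
                            buffer.Columns.Add(reader.GetName(i), reader.GetFieldType(i))
                        Next

                        Using destinationConnection As New SqlConnection(Conn)
                            destinationConnection.Open()
                            Using bulkCopy As New SqlBulkCopy(destinationConnection)
                                bulkCopy.DestinationTableName = "tabDestination"
                                bulkCopy.BatchSize = 2000
                                ' add bulkCopy.ColumnMappings here if the column names differ

                                While reader.Read()
                                    Dim values(reader.FieldCount - 1) As Object
                                    reader.GetValues(values)
                                    buffer.Rows.Add(values)

                                    ' Flush and clear the buffer after every full chunk.
                                    If buffer.Rows.Count >= ChunkSize Then
                                        bulkCopy.WriteToServer(buffer)
                                        buffer.Rows.Clear()
                                    End If
                                End While

                                ' Write whatever is left after the last full chunk.
                                If buffer.Rows.Count > 0 Then
                                    bulkCopy.WriteToServer(buffer)
                                    buffer.Rows.Clear()
                                End If
                            End Using
                        End Using
                    End Using
                End Using
            End Using
        End Sub
    End Module

Calling something like this once per sheet name from the schema loop in the question keeps only one chunk of rows in memory at a time instead of the whole worksheet.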