I am trying to read a 10 GB .csv file and import it into our DB, but when I read this file via a data adapter to fill a DataSet or DataTable it gives an out of memory exception.
-
Please show your work and stack trace as well. Put more details about your problem. Read [FAQ] and [ask] – Soner Gönül Oct 12 '15 at 07:01
-
@AlexH - A good question needs to be clear, show research effort, and be useful. Is this all three of those things? – Sayse Oct 12 '15 at 07:02
-
@Sayse I mean from a general point of view: when I began programming I didn't know the difference between reading a whole file and reading line by line. – Alex H Oct 12 '15 at 07:04
-
Possible duplicate of [FileHelpers throws OutOfMemoryException when parsing large csv file](http://stackoverflow.com/questions/15233454/filehelpers-throws-outofmemoryexception-when-parsing-large-csv-file) – Tony Oct 12 '15 at 07:07
-
If you're importing a .csv file into a DB and this is a one-time operation, why not use an existing tool? AFAIK, most of them read line by line instead of loading it all into memory like you appear to be doing. – WillS Oct 12 '15 at 07:07
-
@AlexH - Aside from the missing effort in this question, you (and others) have assumed that the OP is reading the whole file instead of line by line. That may be so, but it isn't clear that that is the case. You are entitled to your opinion on this question; I just disagree. – Sayse Oct 12 '15 at 07:10
-
I had a [similar question](http://stackoverflow.com/questions/8988195/how-do-i-read-a-large-file-from-disk-to-database-without-running-out-of-memory) here. Some of the answers, links and examples might prove to be useful to you. – Mr Moose Oct 12 '15 at 07:16
-
Transferring 10 GB of data doesn't mean loading it all at once. Is it possible to leverage existing tools? – qxg Oct 12 '15 at 07:25
2 Answers
As per the MSDN documentation for the SqlDataAdapter class:
Represents a set of data commands and a database connection that are used to fill the DataSet and update a SQL Server database. This class cannot be inherited.
If we were to look at the DataSet documentation:
Represents an in-memory cache of data.
This means that you need enough memory to hold the entire data set. The question does not provide detail on how the file is actually read, but to make this work you would essentially need to:
- Iterate over the file one row at a time; do not load the entire file into memory, say as a single string. You can use a TextReader to read it.
- Use a SqlConnection to write into the database one line at a time (see the sketch below).
This way you only keep track of where the data is coming from and where it needs to go, which greatly reduces the amount of data you need to hold in memory at any one time.
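For illustration, here is a minimal sketch of that approach. The table name (dbo.Items), column names, connection string and the naive comma Split are all placeholder assumptions to adapt to your own schema; a real CSV with quoted fields needs a proper parser.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvImporter
{
    static void Main()
    {
        // Placeholder connection string and file path -- replace with your own.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (TextReader reader = new StreamReader("huge.csv"))
        {
            connection.Open();

            using (var command = new SqlCommand(
                "INSERT INTO dbo.Items (Col1, Col2) VALUES (@col1, @col2)", connection))
            {
                var p1 = command.Parameters.Add("@col1", SqlDbType.NVarChar, 100);
                var p2 = command.Parameters.Add("@col2", SqlDbType.Int);

                string line;
                while ((line = reader.ReadLine()) != null)   // only one row in memory at a time
                {
                    var fields = line.Split(',');            // naive split; breaks on quoted fields
                    p1.Value = fields[0];
                    p2.Value = int.Parse(fields[1]);
                    command.ExecuteNonQuery();               // write this row, then move on
                }
            }
        }
    }
}
```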

You can read your file line by line and import it into the database. For parsing the CSV, use CsvHelper.
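A minimal sketch with CsvHelper follows; the Row class and its column names are assumptions, and the CsvReader constructor shown (taking a CultureInfo) is the one used by newer CsvHelper versions. GetRecords yields records lazily, so the whole file is never loaded, and because CsvHelper parses quoted fields it also copes with line breaks inside fields, which plain line-by-line splitting does not (see the comment below).

```csharp
using System.Globalization;
using System.IO;
using CsvHelper;

// Hypothetical record type matching the CSV's columns -- adjust to your file.
public class Row
{
    public string Col1 { get; set; }
    public int Col2 { get; set; }
}

public static class CsvHelperImport
{
    public static void Import(string path)
    {
        using (var reader = new StreamReader(path))
        using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
        {
            // GetRecords streams one record at a time from the underlying reader.
            foreach (var row in csv.GetRecords<Row>())
            {
                // Insert each row here, e.g. with the parameterised INSERT shown above.
            }
        }
    }
}
```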

-
Unreliable due to http://stackoverflow.com/questions/566052/can-you-encode-cr-lf-in-into-csv-files – ivan_pozdeev Oct 12 '15 at 15:57