I'm developing in .NET 3.5 and I've run into a problem that, once solved, may help other people with the same issue.
I'm running out of memory. I need to read a big text file (around 500 MB, 13 million lines), split each line on ';' to get its values (around 7 values per line), and load all of those values into a DataTable.
I don't know how to read and load it without consuming all of the system's memory.
My PC has 8 GB of RAM and it fills up completely.
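A rough back-of-envelope estimate (the exact figures are assumptions) shows why: 13 million rows × 7 fields is about 91 million strings. Each .NET string carries roughly 20 bytes of object overhead plus 2 bytes per character, so even short fields cost tens of bytes each, which already adds up to several GB before counting the DataRow and DataTable bookkeeping on top.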
Thanks
DataTable dt = new DataTable();
dt.Columns.Add("Months");
dt.Columns.Add("WHS");
dt.Columns.Add("BRICK");
dt.Columns.Add("DAY");
dt.Columns.Add("SALES TYPE");
dt.Columns.Add("FCC");
dt.Columns.Add("UNITS");

String line;
// Note: StreamReader already buffers internally, so the BufferedStream is not strictly needed.
using (FileStream fs = File.Open("path", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bs = new BufferedStream(fs))
using (StreamReader sr = new StreamReader(bs))
{
    // Read the file line by line; every row stays in memory inside the DataTable.
    while ((line = sr.ReadLine()) != null)
    {
        string[] parts = line.Split(';');
        dt.Rows.Add(parts[0], parts[1], parts[2], parts[3], parts[4], parts[5], parts[6]);
    }
    dataGridView1.DataSource = dt;
}
TEMPORARY SOLUTION
You need to clear the DataTable periodically with a counter (I do all of this in a BackgroundWorker; see the sketch after the loop for pushing the batch back to the UI thread), for example:
int counter = 0;
while ((line = sr.ReadLine()) != null)
{
    string[] parts = line.Split(';');
    dt.Rows.Add(parts[0], parts[1], parts[2], parts[3], parts[4], parts[5], parts[6]);
    counter++;
    dataGridView1.DataSource = dt; // re-binds the grid on every line; moving this inside the if below would be much cheaper

    // Every 500,000 rows, drop what has accumulated so the table never grows unbounded.
    if (counter >= 500000)
    {
        dt.Clear();
        counter = 0;
    }
}
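Since the loop runs on a BackgroundWorker thread, assigning dataGridView1.DataSource directly touches a WinForms control from a non-UI thread. Below is a minimal sketch of how the batch could be handed back to the UI thread with Control.Invoke; the names backgroundWorker1, dataGridView1 and the BuildSchema helper are assumptions, not taken from my code above, and this version starts a fresh table per batch instead of calling Clear() so the grid keeps showing the last completed batch while memory for the previous one can be collected.

// Inside the Form class (uses System.Data, System.IO, System.ComponentModel, System.Windows.Forms).
private static DataTable BuildSchema()
{
    // Hypothetical helper that just repeats the Columns.Add calls from above.
    DataTable t = new DataTable();
    foreach (string col in new[] { "Months", "WHS", "BRICK", "DAY", "SALES TYPE", "FCC", "UNITS" })
        t.Columns.Add(col);
    return t;
}

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    DataTable dt = BuildSchema();
    int counter = 0;
    string line;

    using (StreamReader sr = new StreamReader(File.Open("path", FileMode.Open, FileAccess.Read, FileShare.ReadWrite)))
    {
        while ((line = sr.ReadLine()) != null)
        {
            string[] parts = line.Split(';');
            dt.Rows.Add(parts[0], parts[1], parts[2], parts[3], parts[4], parts[5], parts[6]);

            if (++counter >= 500000)
            {
                DataTable batch = dt;
                // Marshal the grid update onto the UI thread.
                dataGridView1.Invoke((MethodInvoker)delegate { dataGridView1.DataSource = batch; });
                dt = BuildSchema();   // start a fresh table so the old batch can be garbage-collected
                counter = 0;
            }
        }
    }

    // Push whatever is left in the final, partial batch.
    DataTable last = dt;
    dataGridView1.Invoke((MethodInvoker)delegate { dataGridView1.DataSource = last; });
}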