
[Screenshot of the sample text file omitted.]

What is the most efficient way, in terms of read speed, to load the values in a text file (see the sample screenshot above) into an Excel or .csv file using C#?

The first row will always contain the column names. In this sample there are 6 columns, but that isn't fixed: in another file there could be 4 or 9 columns.

Skipping those blank rows would also be nice.

NOTE that the text files can be as big as 2 or 4 GB.

Thanks.

    Use one of the already-available CSV parsers instead of reinventing the wheel. Here is one: http://www.codeproject.com/Articles/9258/A-Fast-CSV-Reader Btw, why don't you use a database instead of fiddling around with 4 GB text files? – Tim Schmelter Mar 14 '13 at 12:03
    `Microsoft.VisualBasic.FileIO.TextFieldParser` is part of the framework. Available immediately if you can tolerate polluting your references. I don't know how fast it is. – Jodrell Mar 14 '13 at 12:08
  • @TimSchmelter the files are Log files generated by a monitoring system i have no control over. – StackTrace Mar 14 '13 at 12:09
  • I highly recommend using `Microsoft.VisualBasic.FileIO.TextFieldParser`. It's going to be plenty fast enough. Your limiting factor is going to be disk I/O, not the speed of the CSV parsing. – Jim Mischel Mar 14 '13 at 13:41
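To illustrate the `TextFieldParser` suggestion from the comments, here is a minimal sketch. It assumes the log file is delimited (the delimiter is a parameter, since the screenshot isn't available); the paths in `Main` are placeholders. `TextFieldParser` skips blank lines on its own, and `ReadFields` returns however many columns each row actually has, which covers the variable column count.

```csharp
using System;
using System.IO;
using Microsoft.VisualBasic.FileIO;  // add a reference to Microsoft.VisualBasic

class CsvConverter
{
    // Parse a delimited text file and re-emit it as comma-separated output.
    // Blank lines are skipped automatically by TextFieldParser.
    public static void ConvertToCsv(string inputPath, string outputPath, string delimiter)
    {
        using (var parser = new TextFieldParser(inputPath))
        using (var writer = new StreamWriter(outputPath))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(delimiter);

            while (!parser.EndOfData)
            {
                // One row at a time; the array length matches the row's column count.
                string[] fields = parser.ReadFields();
                writer.WriteLine(string.Join(",", fields));
            }
        }
    }

    static void Main()
    {
        // Placeholder paths; guard so the sketch is safe to run as-is.
        if (File.Exists("input.txt"))
            ConvertToCsv("input.txt", "output.csv", ",");
    }
}
```

Because the parser streams the file rather than loading it into memory, this approach scales to multi-gigabyte inputs.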

1 Answer


You can use a BufferedStream, which buffers an existing stream (e.g. a FileStream) and can improve read performance.

using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bs = new BufferedStream(fs))
using (StreamReader sr = new StreamReader(bs))
{
    string line;
    while ((line = sr.ReadLine()) != null)
    {
        // process the line here
    }
}
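Filling in the loop body, a sketch of what the question asks for: copy each non-blank line to a .csv file as it is read. The paths are placeholders; the rows are passed through unchanged, so this assumes the input is already in a delimited format Excel can open.

```csharp
using System;
using System.IO;

class StreamingCopy
{
    // Stream the input line by line, skip blank rows, and write
    // everything else to the output file. Returns the row count.
    public static long CopySkippingBlanks(string inputPath, string outputPath)
    {
        long written = 0;
        using (var fs = File.Open(inputPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var bs = new BufferedStream(fs))
        using (var sr = new StreamReader(bs))
        using (var sw = new StreamWriter(outputPath))
        {
            string line;
            while ((line = sr.ReadLine()) != null)
            {
                if (string.IsNullOrWhiteSpace(line))  // skip blank rows
                    continue;
                sw.WriteLine(line);
                written++;
            }
        }
        return written;
    }

    static void Main()
    {
        // Placeholder paths; guard so the sketch is safe to run as-is.
        if (File.Exists("input.txt"))
            Console.WriteLine($"Wrote {CopySkippingBlanks("input.txt", "output.csv")} rows.");
    }
}
```

Only one line is held in memory at a time, so the working set stays small no matter how large the file is; as noted in the comments, disk I/O, not parsing, will be the limiting factor.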

Sources:

Reading large text files with streams in C#

http://msdn.microsoft.com/en-us/library/system.io.bufferedstream.aspx

answered by Darren