I have a text file containing a sequence of price data. The same problem applies to any long history of records, such as temperature, air humidity, prices, or log files.
The head of my history file looks like the following:
If I want to read and process a file too large for memory, I would normally use the following code:
using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bs = new BufferedStream(fs))
using (StreamReader sr = new StreamReader(bs))
{
    string line;
    while ((line = sr.ReadLine()) != null)
    {
        // Process data
    }
}
In my case a new record is appended every 1000 ms, so the most recent data is at the end of the file. The problem arises when I only need to process the most recent data.
Example:
I want to generate an average of the last 30 days.
It would be most efficient to start at the end of the file and move towards the beginning until the 30-day threshold is met.
The sample code above would read through the whole file, which is barely usable in this scenario: it hits the worst case every single time I need to update my recent-data indicators.
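One workaround I have considered is to seek close to the end and read forward from there. This is only a sketch under made-up assumptions: the 40-byte average line length and the 2x safety margin are guesses, not measured values, and `TailLinesApprox` is my own helper name.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class TailReader
{
    // Sketch: estimate how many bytes the last `wantedLines` records occupy,
    // seek that far back from the end, and read forward. If the estimate
    // holds, the result contains at least `wantedLines` lines plus a few
    // extra ones before them; the caller keeps only the tail it needs.
    public static List<string> TailLinesApprox(string path, long wantedLines, int avgLineBytes = 40)
    {
        var lines = new List<string>();
        using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (StreamReader sr = new StreamReader(fs))
        {
            long estimate = wantedLines * avgLineBytes * 2; // 2x safety margin (assumption)
            if (fs.Length > estimate)
            {
                fs.Seek(-estimate, SeekOrigin.End);
                sr.DiscardBufferedData();   // invalidate the reader's buffer after seeking
                sr.ReadLine();              // drop the partial line we landed in
            }
            string line;
            while ((line = sr.ReadLine()) != null)
                lines.Add(line);
        }
        return lines;
    }
}
```

If the estimate is too small (lines longer than assumed), the caller would have to retry with a larger margin; that is the price of not scanning backwards for real.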
This issue of course applies to any operation where you want to process the last n elements of a file.
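Absent a built-in reverse reader, I imagine the backwards approach could be hand-rolled by seeking fixed-size chunks from the end and scanning each chunk for newlines. The sketch below assumes UTF-8 (or ASCII) text, where a '\n' byte can never occur inside a multi-byte character; `ReadLinesReversed` is my own name, not a framework API.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

static class ReverseReader
{
    // Yields the file's lines from last to first by reading fixed-size
    // chunks backwards from the end. Assumes UTF-8/ASCII text, where a
    // '\n' byte never appears inside a multi-byte sequence.
    public static IEnumerable<string> ReadLinesReversed(string path, int bufferSize = 4096)
    {
        using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            long position = fs.Length;
            byte[] buffer = new byte[bufferSize];
            // Bytes of a line that continues into the next (later) chunk.
            List<byte> carry = new List<byte>();

            while (position > 0)
            {
                int toRead = (int)Math.Min(bufferSize, position);
                position -= toRead;
                fs.Seek(position, SeekOrigin.Begin);
                int read = 0;
                while (read < toRead)
                    read += fs.Read(buffer, read, toRead - read);

                int lineEnd = toRead; // exclusive end of the line being assembled
                for (int i = toRead - 1; i >= 0; i--)
                {
                    if (buffer[i] != (byte)'\n') continue;
                    yield return Decode(buffer, i + 1, lineEnd - (i + 1), carry);
                    carry.Clear();
                    lineEnd = i;
                }
                // The chunk's leading bytes belong to a line starting earlier in the file.
                byte[] head = new byte[lineEnd];
                Array.Copy(buffer, 0, head, 0, lineEnd);
                carry.InsertRange(0, head);
            }
            if (carry.Count > 0)
                yield return Decode(Array.Empty<byte>(), 0, 0, carry);
        }
    }

    private static string Decode(byte[] buf, int offset, int count, List<byte> carry)
    {
        byte[] bytes = new byte[count + carry.Count];
        Array.Copy(buf, offset, bytes, 0, count);
        carry.CopyTo(bytes, count);
        int len = bytes.Length;
        if (len > 0 && bytes[len - 1] == (byte)'\r') len--; // trim the CR of a CRLF ending
        return Encoding.UTF8.GetString(bytes, 0, len);
    }
}
```

Computing the 30-day average would then mean enumerating `ReadLinesReversed`, parsing each record's timestamp, and stopping as soon as a record older than 30 days appears. Note that a file ending with a newline yields one empty string first; callers may want to skip it.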
Is there built-in functionality to read a file from end to start?