
I need to write a realtime monitoring application that stores information from log files into a GridView, so that the data can be processed further by other features/functions in my application.

I have an application running on my server that creates entries in a log file at odd times (CSV-formatted).
These files can sometimes grow to 100 MB. I don't always want to scan/examine the whole file, because I know that new entries are always appended to the bottom of the file.

How should I construct the update-function (in my realtime monitoring application)?

  • Should I compare the file size, and examine/update only if it has changed?
  • Should I examine the files every minute?
  • Should I check the "Date modified" timestamp?

By the way, a full scan is time-consuming; can this be reduced by only scanning for new entries?

So my question is: what is the best way to solve this?

MrMAG
  • You could try using [`FileSystemWatcher`](http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx) for notifications when the file changes. I don't know if it would fit your use case, though. – Patryk Ćwiek Jun 13 '13 at 15:02
  • I think one important question for you to answer is how often the file is modified. If it is frequent, then a time-based check would be more appropriate than a FileSystemWatcher. If it is infrequent, then you can do it with a FileSystemWatcher every time the file is modified. – Tombala Jun 13 '13 at 15:19
  • During the day it's (very) frequent, just at odd times. The entries per minute/hour vary widely. – MrMAG Jun 13 '13 at 15:54
  • If there is no reason why the log cannot be in a database, my suggestion would be to move it to a database. It will make processing much easier. If not, then a periodic check of the file would be the right way, IMO. I would somehow keep track of the last line number you read; then you can fast-forward to that position in the file and start reading from there. – Tombala Jun 13 '13 at 18:39
  • 1
    Also, see [this](http://stackoverflow.com/questions/4273699/how-to-read-a-large-1-gb-txt-file-in-net) for reading large files. You have two options: MemoryMappedFile or StreamReader. One lets you read random positions in a file without scanning through it. The other is, well, the opposite. :) – Tombala Jun 13 '13 at 18:45
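The suggestions in the comments above (FileSystemWatcher notifications, remembering the last read position, and opening the file with shared access) can be combined into a sketch like the following. The `LogTailer` class, the `OnNewLine` event, and the watcher wiring are illustrative assumptions, not code from the original post:

```csharp
using System;
using System.IO;
using System.Text;

// Sketch: tail a growing CSV log by remembering the byte offset
// already processed, so only newly appended entries are read.
// Class and event names are illustrative, not from the question.
public class LogTailer
{
    private readonly string _path;
    private long _lastPosition;   // byte offset already processed

    public event Action<string> OnNewLine;

    public LogTailer(string path)
    {
        _path = path;
    }

    // Call this from a FileSystemWatcher.Changed handler or a timer.
    public void ReadNewEntries()
    {
        // FileShare.ReadWrite so the writing application is not blocked.
        using (var fs = new FileStream(_path, FileMode.Open,
                                       FileAccess.Read, FileShare.ReadWrite))
        {
            if (fs.Length < _lastPosition)
                _lastPosition = 0;   // file truncated or rotated: start over

            fs.Seek(_lastPosition, SeekOrigin.Begin);
            using (var reader = new StreamReader(fs, Encoding.UTF8))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    var handler = OnNewLine;
                    if (handler != null)
                        handler(line);   // e.g. parse the CSV, add a grid row
                }
                _lastPosition = fs.Position;   // stream fully consumed at EOF
            }
        }
    }
}
```

Wiring it up might look like `watcher.Changed += (s, e) => tailer.ReadNewEntries();` on a `FileSystemWatcher` configured with `NotifyFilters.LastWrite | NotifyFilters.Size`; a timer calling the same method is a reasonable fallback in case watcher events are missed. Each call reads only the bytes appended since the last call, so the 100 MB file is never rescanned.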

1 Answer


I use baretail. It updates constantly, and as long as your logs contain meaningful statements, you can easily track the changes as they occur.

Edmund Covington
  • Baretail looks great, but I need to process the data further; I don't just want to monitor it. That's why I want some C# code to work with. Thanks anyway! – MrMAG Jun 13 '13 at 15:14
  • You could use NLogger, a .NET library to log your data, which can include a timestamp. If you are looking for particular items, give your log entries keys that make them easy to find. I have found that this is usually enough to create manageable logs. – Edmund Covington Jun 13 '13 at 15:19
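The logging-library suggestion above might look like the minimal sketch below. I'm assuming the NLog library is meant; the class name, key prefix, and message are illustrative, and NLog itself is configured separately (e.g. via an `NLog.config` file with a file target that writes `${longdate}` and the message):

```csharp
using NLog;

// Minimal sketch of keyed, timestamped logging with NLog, as suggested
// in the comment above. Assumes NLog is configured elsewhere (e.g. in
// NLog.config) with a file target; class name and key are illustrative.
public class ImportJob
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    public void Run()
    {
        // A fixed key ("IMPORT") at the start of each entry makes these
        // log lines easy to find when scanning or filtering later.
        Log.Info("IMPORT,rows processed: {0}", 42);
    }
}
```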