
I have a Windows service that retrieves a few selected files from a directory containing millions of other files. The service runs out of memory every hour or so. This never happened when the directory held only a few thousand files, so it feels as though some resource is not being disposed of properly. But as far as I can tell, the stream is disposed of correctly:

using (FileStream fs = File.Open(fileName, FileMode.Open, FileAccess.Read))
{
  //deserializes the file with a binary formatter
}
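
For completeness, the body of that block is roughly the following (MyRecord is just a stand-in for the real serialized type):

// namespaces used: System.IO, System.Runtime.Serialization.Formatters.Binary
using (FileStream fs = File.Open(fileName, FileMode.Open, FileAccess.Read))
{
  // MyRecord is a placeholder for the actual serialized type
  BinaryFormatter formatter = new BinaryFormatter();
  MyRecord record = (MyRecord)formatter.Deserialize(fs);
  // ... work with record ...
}
// both fs and formatter go out of scope here; fs is closed by Dispose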

The directory itself is marked as compressed, so I also wonder whether this is contributing to the problem.

Can anyone explain the cause of this apparent memory leak?

John Saunders
HH.

5 Answers


You should run a memory profiler against your program to see what is actually taking up all that memory. dotTrace, ANTS, and MemProfiler are all great products. They will tell you whether managed or unmanaged objects are consuming your memory. If the compression is the issue, it will probably show up as unmanaged memory; if the problem is with your disposables, you'll see managed memory leaks.
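
If you want a quick first check before reaching for one of those tools, comparing the GC heap with the whole process footprint will hint at which side is growing; this is only a rough sketch, not a substitute for a real profiler:

// namespaces used: System, System.Diagnostics
long managedBytes = GC.GetTotalMemory(true);                          // bytes on the GC heap after a collection
long privateBytes = Process.GetCurrentProcess().PrivateMemorySize64;  // bytes committed to the whole process

Console.WriteLine("Managed heap:      {0:N0} bytes", managedBytes);
Console.WriteLine("Private bytes:     {0:N0} bytes", privateBytes);
Console.WriteLine("Approx. unmanaged: {0:N0} bytes", privateBytes - managedBytes);

Log that every few minutes from the service; if the unmanaged figure climbs while the managed heap stays flat, the leak is outside your C# objects.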

Alternatively, you can read this question (and answer) to get some background on having a million files in a folder with NTFS.

Garo Yeriazarian

I agree that the problem probably has little to do with your .NET code and more to do with the underlying file system. This question may be helpful.

etoisarobot

It is almost certainly because the folder is compressed and contains millions of files. There is likely some table that has to be searched to decompress the data, and that table would grow with the number of files in the folder. The act of decompressing the file is what is causing your problem.

smelch

Are you looking in the right place for your issue?

With a known file name, the only latency should be whatever the file system itself incurs in managing a directory that large; your own code should not see memory problems or undue delay.

Are you listing the directory or searching for the file(s) in your service code?
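
For example (the folder path and fileName below are made up), the first call pulls an entry for every file in the folder into memory, while the second only touches the one file you already know about:

// namespace used: System.IO

// Enumerating the folder materializes millions of path strings:
string[] everything = Directory.GetFiles(@"D:\data");

// Opening a known file asks the file system for just that one entry:
using (FileStream fs = File.Open(Path.Combine(@"D:\data", fileName), FileMode.Open, FileAccess.Read))
{
  // read / deserialize here
}

If your service does the former on each pass, that alone could explain the memory growth.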

Sky Sanders
  • I have the file name, so there's no searching. I too thought the leak may be elsewhere but this service/code is doing very little, and the fact that the dir is compressed makes me suspicious. – HH. Jul 22 '10 at 18:52
  • @HH, NTFS compression is handled by the file system, not your service. This cannot have any impact on memory usage. You may be looking in the wrong place for your memory problem. – Sky Sanders Jul 22 '10 at 19:12

A "directory that contains millions of other files" does not sound healthy to me. It might be that there is no memory leak, just that the operation requires more memory than is available ...

Andrei
  • But my question revolves around "that operation". Why would retrieving a file by path/name be affected by the size of the directory it's located in? – HH. Jul 22 '10 at 18:55