I have a program whose data directory currently holds something like 10-30K files, and it's starting to cause problems. Should I expect that many files to cause problems, and is my only solution to tweak my file structure, or does this indicate some other problem?
- @BCSD Did my answer answer your question? If so, could you please accept it. Thanks – James Campbell Apr 12 '10 at 14:30
- @Vecdid It looks like it's the best I'm going to get... – BCS Apr 12 '10 at 15:33
1 Answer
How to Optimize NTFS Performance
When Windows NT, 2000, or XP accesses a directory on an NTFS volume, it updates the LastAccess timestamp on each directory it traverses. If there are a large number of directories, these extra metadata writes can affect performance.
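If that timestamp churn turns out to matter for your workload, Windows can disable last-access updates on NTFS entirely. This is a system-wide setting, so weigh it against anything that relies on access times; the usual approach is sketched below:

```shell
:: Disable NTFS last-access timestamp updates (requires an admin prompt).
:: This sets the NtfsDisableLastAccessUpdate value to 1 under
:: HKLM\SYSTEM\CurrentControlSet\Control\FileSystem
fsutil behavior set disablelastaccess 1

:: Inspect the current setting
fsutil behavior query disablelastaccess
```

A reboot may be needed before the change takes effect.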

– James Campbell
- That last bit is good to know, as one "fix" (in effect, sharding) would result in large numbers of directories (as opposed to the current huge number of just files). By your point, my fix might be even worse. – BCS Apr 08 '10 at 06:55
- The link gives no citation or context for that claim. I strongly suspect it means that each subdirectory is accessed [and therefore its last-access date is updated] when the directory _is opened in Explorer_, not when it is accessed by any process – Random832 Oct 12 '11 at 20:36
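The sharding "fix" BCS mentions in the comments (splitting one flat directory into hash-prefixed buckets) can be sketched as follows. `shard_path` and its parameters are hypothetical names for illustration, not anything from the original program; with the defaults, ~30K files spread across 256 buckets at roughly 120 files per directory, so neither the file count per directory nor the directory count gets large:

```python
import hashlib
from pathlib import Path

def shard_path(root: str, name: str, levels: int = 1, width: int = 2) -> Path:
    """Map a file name to a bucket subdirectory derived from a hash prefix.

    levels=1, width=2 gives 256 buckets (two hex chars); the mapping is
    deterministic, so locating a file later needs no index, just its name.
    """
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    buckets = [digest[i * width:(i + 1) * width] for i in range(levels)]
    return Path(root).joinpath(*buckets, name)

# e.g. shard_path("data", "report-001.bin") -> data/<2-hex-char bucket>/report-001.bin
```

Writers call `shard_path(...).parent.mkdir(parents=True, exist_ok=True)` before creating each file; readers recompute the same path from the name alone.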