I have a program that produces a large number of small files (say, 10,000). After they are created, another script accesses and processes them one by one.
Questions:
- Does it matter, in terms of performance, how the files are organized (all in one directory vs. spread across multiple directories)?
- If so, what is the optimal number of directories and files per directory?
I run Debian with an ext4 file system.
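For concreteness, here is a minimal sketch of the two layouts I am comparing (file names, sizes, and the fan-out factor are placeholders, not my real workload):

```python
import hashlib
import os

def flat_layout(root, n=10_000):
    """Write all n files directly into one directory."""
    os.makedirs(root, exist_ok=True)
    for i in range(n):
        with open(os.path.join(root, f"item_{i:05d}.dat"), "w") as f:
            f.write("payload\n")

def fanout_layout(root, n=10_000, fanout=100):
    """Spread n files over `fanout` subdirectories, chosen by a hash
    of the file name, so each directory holds roughly n / fanout files."""
    for i in range(n):
        name = f"item_{i:05d}.dat"
        bucket = int(hashlib.md5(name.encode()).hexdigest(), 16) % fanout
        subdir = os.path.join(root, f"{bucket:03d}")
        os.makedirs(subdir, exist_ok=True)
        with open(os.path.join(subdir, name), "w") as f:
            f.write("payload\n")
```

The consumer would then either scan one large directory or walk the subdirectory tree; I am asking whether the second layout actually buys anything on ext4, and if so what `fanout` to pick.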