I have a Python script that will generate about 28 million files in a single directory, but after creating roughly 6 million files, Python throws an OSError: [Errno 28] No space left on device whenever it tries to open a new txt file.
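A minimal sketch of what my script does (file names and contents here are placeholders, and NUM_FILES is scaled down from the real 28 million so the sketch runs quickly):

```python
import os
import tempfile

# In the real script this is 28_000_000; kept tiny here for illustration.
NUM_FILES = 1000

def generate(target_dir: str, n: int = NUM_FILES) -> int:
    """Write n small txt files into one flat directory."""
    written = 0
    for i in range(n):
        path = os.path.join(target_dir, f"file_{i:08d}.txt")
        # In the real run, around the ~6 millionth file, open() raises
        # OSError: [Errno 28] No space left on device
        with open(path, "w") as f:
            f.write("some data\n")
        written += 1
    return written

with tempfile.TemporaryDirectory() as d:
    count = generate(d)
```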
My disk still has plenty of free space, and the filesystem is ext4. I also checked the inodes, and there are still 112M free.
Here is my disk info (from parted):
Disk /dev/sdb: 2000GB
Sector size (logical/physical): 512B/4096B
Partition Table: msdos
Disk Flags:
Number  Start   End     Size    Type      File system     Flags
 1      1049kB  1999GB  1999GB  primary   ext4            boot
 2      1999GB  2000GB  1023MB  extended
 5      1999GB  2000GB  1023MB  logical   linux-swap(v1)
and the remaining space (from df -h):
Filesystem  Size  Used  Avail  Use%  Mounted on
/dev/sdb1   1.8T  150G  1.6T   9%    /
and the inodes (from df -i):
Filesystem  Inodes  IUsed  IFree  IUse%  Mounted on
/dev/sdb1   117M    4.5M   112M   4%     /
Is there a good way to store a large number of files on Linux?
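For reference, one idea I am considering is hashing each file name into a two-level subdirectory tree so that no single directory holds millions of entries (the helper name sharded_path and the 2+2 hex-character layout are just my own sketch, not anything I have benchmarked):

```python
import hashlib
import os
import tempfile

def sharded_path(base_dir: str, name: str) -> str:
    # Two hex characters per level gives 256 * 256 = 65536 leaf
    # directories, so 28M files average out to ~430 entries each.
    digest = hashlib.md5(name.encode()).hexdigest()
    return os.path.join(base_dir, digest[:2], digest[2:4], name)

# Usage sketch: create the shard directory on demand, then write.
with tempfile.TemporaryDirectory() as base:
    p = sharded_path(base, "file_00000001.txt")
    os.makedirs(os.path.dirname(p), exist_ok=True)
    with open(p, "w") as f:
        f.write("some data\n")
```

Would a layout like this avoid the error, or is there a better-established approach?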