
I have a Python script that generates about 28 million files in a single directory, but after it has created about 6 million files, Python throws OSError: [Errno 28] No space left on device when it tries to open a new .txt file.

My disk still has plenty of free space, and the filesystem is ext4. I also checked the inodes, and there are still 112M free.

Here is my disk info:

Disk /dev/sdb: 2000GB
Sector size (logical/physical): 512B/4096B
Partition Table: msdos
Disk Flags:

Number  Start   End     Size    Type      File system     Flags
 1      1049kB  1999GB  1999GB  primary   ext4            boot
 2      1999GB  2000GB  1023MB  extended
 5      1999GB  2000GB  1023MB  logical   linux-swap(v1)

and the remaining space:

Filesystem      Size  Used Avail Use% Mounted on
/dev/sdb1       1.8T  150G  1.6T   9% /

and inodes

Filesystem     Inodes IUsed IFree IUse% Mounted on
/dev/sdb1        117M  4.5M  112M    4% /

Is there a good way to store a large number of files on Linux?

Vincent_Wang
  • See: https://stackoverflow.com/questions/6998083/python-causing-ioerror-errno-28-no-space-left-on-device-results-32766-h. It basically says this error does not necessarily mean you lack free space. Check that your file names do not exceed 255 characters. – MSH Oct 28 '21 at 07:00
  • @MSH Thanks, I already read that topic, and my longest file name is about 40 characters. – Vincent_Wang Oct 28 '21 at 07:21
  • Can you run a test: generate the files without writing anything in them, i.e. write empty files. Or even better, can you share your code so we can test? – MSH Oct 28 '21 at 07:26

1 Answer


After trying lots of other approaches (e.g. reformatting the disk, calling gc to free memory), the solution was to spread the files across multiple subdirectories instead of putting them all in one directory, and it works.

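This ENOSPC on a single huge ext4 directory is often not about disk space or inodes at all: with dir_index enabled, the directory's hash-tree index can fill up, and ext4 reports that condition as "No space left on device". Spreading the files over many subdirectories keeps each directory index small. Below is a minimal sketch of that dispersal idea, assuming a hypothetical output root BASE_DIR, a shard_path helper, and a file_<i>.txt naming scheme; none of these names come from the original code, so adapt them to your script.

import hashlib
import os

BASE_DIR = "output"        # hypothetical output root -- adjust to your path
NUM_FILES = 28_000_000     # total number of files, taken from the question

def shard_path(name: str) -> str:
    """Map a file name to BASE_DIR/ab/cd/name using the first hex digits of its hash."""
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    # Two levels of 2-hex-digit directories -> 256 * 256 = 65,536 buckets,
    # i.e. only a few hundred files per directory for 28 million files.
    return os.path.join(BASE_DIR, digest[:2], digest[2:4], name)

for i in range(NUM_FILES):
    name = f"file_{i}.txt"          # hypothetical naming scheme
    path = shard_path(name)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write("...")              # write your real content here

With 65,536 buckets, 28 million files average out to roughly 430 entries per directory, which keeps every directory index well below the point where ext4 starts failing with ENOSPC.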
Vincent_Wang