I am writing to a file using Python. The script suddenly stops running and throws IOError: [Errno 27] File too large. Is there a limit on the size of a file that a program is allowed to create? Has anyone else faced this issue? The file was close to 4.3 GB (it is a bit big) when it stopped.
3 Answers
This Python bug report indicates that the OS, not Python, is the source of this error message.
Since you are writing to a FAT partition, and the maximum file size on FAT32 is 4 GB (LinuxFilesystemsExplained), this is most likely the cause of your problem. Running your program on a system or partition with a different file system would tell you for sure.
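Not part of the original answer, but a minimal sketch of how this failure can be caught instead of crashing the script. The `safe_write` helper is hypothetical; it relies on the fact that "File too large" surfaces as `errno.EFBIG` (27 on Linux), which Python 3 raises as `OSError` (the `IOError` name is an alias for it):

```python
import errno
import os
import tempfile

def safe_write(path, chunks):
    """Write an iterable of byte chunks to path; return False if the
    filesystem's maximum file size is hit (EFBIG, 'File too large')."""
    try:
        with open(path, "wb") as f:
            for chunk in chunks:
                f.write(chunk)
        return True
    except OSError as e:
        if e.errno == errno.EFBIG:
            print(f"File size limit reached while writing {path}")
            return False
        raise

# A small write succeeds; on a FAT32 partition a 4 GB+ write would not.
tmp = os.path.join(tempfile.mkdtemp(), "out.bin")
print(safe_write(tmp, [b"x" * 1024]))  # True
```

This lets the script report which file hit the limit and clean up, rather than dying mid-write.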

- There is no file size limit on my account; I checked using the ulimit command as specified [here](http://www.cyberciti.biz/faq/file-size-limit-exceeded-error-under-linux-and-solution/). File size says unlimited. – viper Jun 08 '12 at 21:30
- One possible explanation is in the comment section (2nd comment, by username 'Chris') of that [link](http://www.cyberciti.biz/faq/file-size-limit-exceeded-error-under-linux-and-solution/): it says that the FAT filesystem only supports file sizes up to 4 GB. The partition I am writing to is FAT. – viper Jun 08 '12 at 21:37
- @viper Yes, that seems the most likely cause of your problem then. Can you test your program on a system with a different file system? That way you'd know for sure if you are running into this limit. – Levon Jun 08 '12 at 21:39
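As a side note on the ulimit check discussed above: the same per-process limit can be read from within Python via the standard `resource` module (POSIX only), without shelling out:

```python
import resource

# RLIMIT_FSIZE is the limit that `ulimit -f` reports;
# RLIM_INFINITY corresponds to ulimit's "unlimited".
soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)
print("unlimited" if soft == resource.RLIM_INFINITY else soft)
```

Note that this is a per-process limit imposed by the kernel, separate from any filesystem-level maximum such as FAT32's 4 GB cap, so it can read "unlimited" and the write can still fail.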
I also got this error when I had too many files in one directory. I had 64,435 files in a directory, each named with 10 digits plus '.json', and any subsequent attempt to write a new file to that directory threw an error, e.g. OSError: [Errno 27] File too large: 'ngrams/0/0/0/0000029503.json'
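Fanning files out over more subdirectory levels keeps any single directory from filling up. A minimal sketch matching the 10-digit naming scheme in the error above; the `sharded_path` helper and its digit-based layout are assumptions for illustration, not the original code:

```python
def sharded_path(root, file_id, depth=3, ext=".json"):
    """Build a nested path from a numeric id so files spread across
    subdirectories instead of piling up in one."""
    name = f"{file_id:010d}"      # 10-digit zero-padded name, as in the error
    subdirs = list(name[:depth])  # leading digits choose the subdirectories
    return "/".join([root, *subdirs, name + ext])

print(sharded_path("ngrams", 29503))  # ngrams/0/0/0/0000029503.json
```

If ids cluster (as here, where the leading digits are all zero), sharding on the trailing digits instead, e.g. `name[-depth:]`, distributes the files much more evenly.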

When files get too large, addressing becomes an issue. Typically you get 32-bit file offsets, which translates to a maximum size of about 4 GB.
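The arithmetic behind that figure, which also matches the ~4.3 GB at which the asker's script failed:

```python
# A 32-bit unsigned file offset can address at most 2**32 bytes.
max_bytes = 2 ** 32
print(max_bytes)            # 4294967296
print(max_bytes / 10 ** 9)  # 4.294967296 -- i.e. about 4.3 GB
```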
