
For example, I have 500 CSV files in my local directory and I want to manipulate the data in those 500 files, reading data from each file one by one and writing it to a new CSV file. I know there is a parameter glob in Python that we use for reading more than one CSV file. Is there any limit on the maximum number of CSV files we can read using Python scripting?

J.J
  • There is no "parameter glob". There is a module `glob`. There is no limit on the number of files you can process, but there is a limit on the number of files you can keep open at the same time. Do not open all files at once, and close them when you no longer need them. – DYZ Aug 23 '17 at 02:24
  • The limit is around 10,000, so you should be fine – whackamadoodle3000 Aug 23 '17 at 02:26
  • https://stackoverflow.com/questions/11675301/limitation-to-pythons-glob – whackamadoodle3000 Aug 23 '17 at 02:26
  • Assuming you use Windows, this may help: https://stackoverflow.com/questions/870173/is-there-a-limit-on-number-of-open-files-in-windows – DYZ Aug 23 '17 at 02:29
  • Alright. Thank you guys for the quick response, it's very helpful. – J.J Aug 23 '17 at 02:29
  • How big are the files? You don't necessarily need to have them all open at the same time. You could read them into memory one at a time. Say, a list of 500 lists. – Batman Aug 23 '17 at 02:34

1 Answer


You can open and close as many files as you want; however, there is a limit to how many files you can keep open at the same time. If you are on Linux, you can use `ulimit -n 500` to raise the number of simultaneously open files to 500. More info here
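For illustration, here is a minimal sketch of that approach: the `glob` module lists the files, and each one is opened, processed, and closed before the next, so only one input file is open at any moment. The output name combined.csv and the plain copy of each row are assumptions; replace the row handling with whatever manipulation you need.

```python
import csv
import glob

# Collect the paths of every CSV file in the current directory.
csv_paths = glob.glob("*.csv")

with open("combined.csv", "w", newline="") as out_file:
    writer = csv.writer(out_file)
    for path in csv_paths:
        # Each input file is opened, read, and closed before the next one,
        # so only two file handles are ever open at the same time.
        with open(path, newline="") as in_file:
            for row in csv.reader(in_file):
                writer.writerow(row)  # manipulate `row` here as needed
```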

whackamadoodle3000