
I need to take the contents of a user's directory and send them in a request. Unfortunately, I cannot modify the service I'm making the request to, so I CANNOT zip all the files and send that; I must send all the files individually.

There's a limit on the total size of the files, but not on the number of files. Unfortunately, once I open too many, Python errors out with: [Errno 24] Too many open files. Here's my current code:

import requests

files_to_send = []
files_to_close = []
for file_path in all_files:
    # Open every file up front so they can all go in a single request
    file_obj = open(file_path, "rb")
    files_to_send.append(("files", (file_path, file_obj)))
    files_to_close.append(file_obj)

requests.post(url, files=files_to_send)

for file_to_close in files_to_close:
    file_to_close.close()

Is there any way to get around this open file limit given my circumstances?

user3715648

1 Answer


Can you send the files one at a time? Or maybe send them in batches? You can increase the number of allowed open files, as discussed here:

IOError: [Errno 24] Too many open files

But in general that's finicky and not recommended.
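
If you do go that route anyway, a minimal sketch of raising the per-process limit (assuming a Unix-like system; the resource module is not available on Windows) looks like:

import resource

# Check the current soft/hard limits on open file descriptors
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit as far as the hard limit allows, for this process only
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))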

You can send your requests in batches, like so:

BATCH_SIZE = 10  # Just an example batch size
while all_files:
    files_to_send = []
    files_to_close = []
    # Open at most BATCH_SIZE files before firing off one request
    while all_files and len(files_to_send) < BATCH_SIZE:
        file_path = all_files.pop()
        file_obj = open(file_path, "rb")
        files_to_send.append(("files", (file_path, file_obj)))
        files_to_close.append(file_obj)
    requests.post(url, files=files_to_send)
    for file_obj in files_to_close:
        file_obj.close()

This will send the files in batches of 10, so only 10 are ever open at once. If sending them one at a time works:

for file_path in all_files:
    with open(file_path, "rb") as file_obj:
        requests.post(url, files=[("files", (file_path, file_obj))])

It's generally bad practice to open too many files at once.
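
If you go with batching and want every handle closed even when a request raises, contextlib.ExitStack can take care of the cleanup. A sketch, assuming batch is one of your batches of file paths (a hypothetical name) and url is the same endpoint as above:

import contextlib
import requests

with contextlib.ExitStack() as stack:
    # ExitStack closes every file entered here when the block exits,
    # even if requests.post() raises
    files_to_send = [
        ("files", (path, stack.enter_context(open(path, "rb"))))
        for path in batch  # batch: hypothetical list of paths for this request
    ]
    requests.post(url, files=files_to_send)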

Aaron Krajeski