I'm using the threading module with Python 2.7.10 on OS X 10.11.

I want to run 50,000 HTTP requests simultaneously:

import pickle
import threading

import requests


def save_data(device):
    data = {
        'id': device['id'],
        'token': device['token'],
    }

    request = requests.post('http://localhost/save_data', data=data)
    return request.status_code == 200


def save_data_thread(device, event):
    event.wait()
    result = save_data(device)
    print 'Device id %d result: %r' % (device['id'], result)


devices = pickle.load(open('devices', 'r'))
threads = []

start_event = threading.Event()

print 'Creating threads...'

for i in range(len(devices)):
    thread = threading.Thread(target=save_data_thread, args=(devices[i], start_event))
    threads.append(thread)

print 'Starting threads...'

i = 0

for thread in threads:
    thread.start()

    i += 1
    print 'Started %d threads' % i

start_event.set()

print 'Joining threads...'

for thread in threads:
    thread.join()

print 'Working...'

But I'm getting the exception:

thread.error: can't start new thread

while starting the 2048th thread. I have plenty of free RAM.

Is it possible to increase the maximum number of threads?


1 Answer

If you need to perform many requests in parallel, you might consider using a framework with an asynchronous approach, e.g. Twisted.

The Python GIL only allows one Python thread to run at a time.

The number of threads a process may create is typically limited by the OS, not by free RAM; the failure at exactly 2048 threads points to OS X's per-process thread limit.
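
For the asynchronous route, here is a rough sketch using Twisted together with the treq package (a requests-style HTTP client built on top of Twisted). treq and the concurrency cap of 100 are my assumptions, not something the answer prescribes:

import pickle

import treq
from twisted.internet import defer, task


@defer.inlineCallbacks
def save_data(device):
    # Fire the POST without blocking a thread; the deferred fires when the
    # response arrives.
    response = yield treq.post('http://localhost/save_data',
                               data={'id': device['id'], 'token': device['token']})
    defer.returnValue(response.code == 200)


def main(reactor):
    devices = pickle.load(open('devices', 'rb'))
    # Cap the number of requests in flight at once; 100 is an arbitrary value.
    semaphore = defer.DeferredSemaphore(100)
    deferreds = [semaphore.run(save_data, device) for device in devices]
    return defer.gatherResults(deferreds)


task.react(main)

All 50,000 requests are scheduled from a single thread; the semaphore keeps only a bounded number of them in flight at any moment.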

Edit:

If you want to stick with the threads approach, you might want to parallelize your job in a slightly different manner. Check this post for a solution to a similar problem.
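
A minimal sketch of one such pooled variant (not necessarily what the linked post does): a fixed number of worker threads pull devices from a queue instead of starting one thread per request. save_data() and devices are the ones from the question; the pool size of 50 is an arbitrary choice:

import Queue
import threading

NUM_WORKERS = 50
queue = Queue.Queue()

def worker():
    while True:
        device = queue.get()
        try:
            result = save_data(device)  # save_data() from the question
            print 'Device id %d result: %r' % (device['id'], result)
        finally:
            queue.task_done()

for _ in range(NUM_WORKERS):
    thread = threading.Thread(target=worker)
    thread.daemon = True  # let the process exit once the queue is drained
    thread.start()

for device in devices:
    queue.put(device)

queue.join()  # blocks until every device has been processed

This way you never create more than 50 OS threads, no matter how many requests you have to make.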

Also if working in Python 3.4.3 or the upcoming 3.5.0 (due 2015-09-13) consider the standard library's `asyncio` package. – chucksmash Aug 12 '15 at 18:24
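
A rough sketch of that asyncio route, using the async/await syntax from Python 3.5+ (on 3.4.3 you would write @asyncio.coroutine with yield from instead). The third-party aiohttp client and the concurrency cap of 100 are my assumptions; the comment only mentions asyncio:

import asyncio
import pickle

import aiohttp


async def save_data(session, semaphore, device):
    async with semaphore:  # keep at most 100 requests in flight
        data = {'id': device['id'], 'token': device['token']}
        async with session.post('http://localhost/save_data', data=data) as response:
            return response.status == 200


async def main():
    devices = pickle.load(open('devices', 'rb'))
    semaphore = asyncio.Semaphore(100)
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(
            *(save_data(session, semaphore, device) for device in devices))
    print('Succeeded: %d of %d' % (sum(results), len(results)))


asyncio.get_event_loop().run_until_complete(main())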