
What is currently the most effective way in Python to use multithreading (the threading module, etc.) with the requests library to make many REST API calls per second?

I've looked through lots of posts talking about the best, simplest, or coolest way to do threading, but I could not find anything that fits. I'm very open to changing my approach if someone has had good luck with different libraries.

Current setup: threading, requests, and Queue. I read in a CSV and turn it into a list of dictionaries, then put everything into the Queue and limit the number of worker threads. I'm finding there's a ceiling on how fast I can go before I start hitting timeouts.

Considerations: I don't want my requests to time out; that is more important than speed.

Current Prototype:

import sys
from queue import Queue
from threading import Thread

log = []

def create_account():
    while True:
        account = q.get()
        response = create_acct(account)  # create_acct defined elsewhere
        print(response['acct_no'])
        log.append(response)
        q.task_done()

input_file = 'testaccounts.csv'
concurrent = 15
q = Queue(concurrent * 2)
for i in range(concurrent):
    t = Thread(target=create_account)
    t.daemon = True
    t.start()
try:
    for account in create_account_list:  # list of dicts built from the CSV
        q.put(account)
    q.join()
except KeyboardInterrupt:
    sys.exit(1)
Jeff
  • If you are using requests, you might want to check out https://github.com/kennethreitz/grequests – C.B. Apr 20 '15 at 17:47
  • It's a bit of an older question in that thread; I was looking for something more current that uses requests rather than httplib. I'm willing to check out grequests as well. – Jeff Apr 20 '15 at 17:49
    There is an answer and comment addressing that directly within the thread. – C.B. Apr 20 '15 at 17:49

1 Answer


Using gevent with requests, i.e. the grequests library, is a possible solution. Its pattern is roughly `grequests.map((grequests.post(url, json=a) for a in accounts), size=15)`, where `size` bounds how many requests are in flight at once.
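If you'd rather stay on the standard library, `concurrent.futures.ThreadPoolExecutor` gives you the same bounded concurrency as your Queue-based prototype with less boilerplate, and it makes it easy to collect timeouts instead of letting them kill the batch. A minimal sketch (the helper name, endpoint URL, and the 10-second timeout are placeholders, not anything from your code):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_concurrently(worker, items, max_workers=15):
    """Apply worker to every item using a bounded thread pool.

    Returns a list of (item, result_or_exception) pairs so that a
    timeout or error on one call doesn't abort the whole batch.
    """
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Map each future back to the item it was created from.
        futures = {pool.submit(worker, item): item for item in items}
        for fut in as_completed(futures):
            item = futures[fut]
            try:
                results.append((item, fut.result()))
            except Exception as exc:  # e.g. requests.exceptions.Timeout
                results.append((item, exc))
    return results

# Hypothetical usage with requests (URL and payload shape are assumptions):
# import requests
# session = requests.Session()  # reuses connections across threads
# def create_acct(account):
#     resp = session.post("https://api.example.com/accounts",
#                         json=account, timeout=10)
#     resp.raise_for_status()
#     return resp.json()
# log = run_concurrently(create_acct, create_account_list, max_workers=15)
```

Passing an explicit `timeout=` to each request, then retrying the `(item, exception)` pairs that come back, addresses the "don't time out" requirement more directly than raising thread counts does.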

reptilicus