
So this is my little block of code. It's just an asyncio loop that sends 10 POST requests out to Twilio:

import time
import aiohttp
import asyncio


async def asynchronous():
    # Fire off 10 identical POST requests concurrently
    tasks = [f('NumberFrom', 'NumberTo', 'asyncio imo') for _ in range(10)]
    await asyncio.gather(*tasks)


async def f(NumberFrom, NumberTo, MessageBody):
    try:
        print('Sent at %s' % time.time())
        async with aiohttp.ClientSession() as session:
            await session.post('https://api.twilio.com/2010-04-01/Accounts/AuthPass/Messages.json',
                               data={'From': NumberFrom, 'To': NumberTo, 'Body': MessageBody},
                               auth=aiohttp.BasicAuth(login='AuthUser', password='AuthPass'))
        print('Done at %s' % time.time())
    except Exception as err:
        print('Error encountered at %s: %s' % (time.time(), err))


asyncio.run(asynchronous())

Before anyone asks, I have a paid account with Twilio and don't freeload or spam them. I am not trying to bombard anyone with SMS messages. I'm just required to send a burst of messages out occasionally, and each message needs to be sent to a different number at more or less the same time.

Currently, I am doing this with the threading module, starting an individual thread for each message. This is fine with a few numbers, but it gets inefficient once you need to open more than a few threads. I have to open 20 threads each time I do this, and I'm after a more efficient way to send 20 POST requests asynchronously than spawning threads.
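For reference, the thread-per-message approach described above might look something like this (a minimal sketch; `send_message` is a hypothetical stand-in for the real Twilio POST, simulated here with a sleep):

```python
import threading
import time


def send_message(number_from, number_to, body, results, index):
    # Placeholder for the real Twilio POST; a sleep simulates network latency
    time.sleep(0.1)
    results[index] = 'sent %s -> %s: %s' % (number_from, number_to, body)


def send_burst(n):
    results = [None] * n
    # One OS thread per message: fine for 20, wasteful for thousands
    threads = [threading.Thread(target=send_message,
                                args=('NumberFrom', 'NumberTo', 'asyncio imo', results, i))
               for i in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results


print(send_burst(5))
```

Each thread carries its own stack and scheduling overhead, which is what makes this approach scale poorly compared to an event loop.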

This is the performance I am getting right now with asyncio:

>>> asyncio.run(asynchronous())
0.0
Sent at 1553142004.4640338
Sent at 1553142004.5059218
Sent at 1553142004.5119061
Sent at 1553142004.5178897
Sent at 1553142004.5238738
Sent at 1553142004.5288606
Sent at 1553142004.5348446
Sent at 1553142004.5388453
Sent at 1553142004.5448182
Sent at 1553142004.5488071
Done at 1553142004.9834092
Done at 1553142004.9913745
Done at 1553142005.0013483
Done at 1553142005.0153105
Done at 1553142005.0264556
Done at 1553142005.0342588
Done at 1553142005.0472543
Done at 1553142005.0581958
Done at 1553142005.066205
Done at 1553142005.0731542
>>> 

I average about 100 POST requests per second. For some reason I thought asyncio would be faster than this. I've read articles claiming Python is capable of 1,000,000 requests per second. I'm not expecting that; I just figured I'd be able to get an order of magnitude more performance out of asyncio.

Is there an obvious error in my code that is reducing asyncio's efficiency, or is this just the peak of what asyncio can do? I am not new to Python, but I am new to asyncio, so I have no clue what I am doing here. Please point out anything obvious.

For reference, I am running a 4-core 3.2 GHz Intel i7 processor, and this script was the only thing running at the time. I know my CPU is not the bottleneck.

My internet spikes to about 250 Kbps when running this, which is nowhere near my ISP cap of 3.5 Mbps. I know my internet is not the bottleneck.

I am running this script in Python 3.7.2 in the IDLE shell.

1 Answer


You should pass a custom connector to the session:

connector = aiohttp.TCPConnector(limit=None)
async with aiohttp.ClientSession(connector=connector) as session:
    # ...

The reasons for that are explained here in more detail.
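The gist of that explanation: the connector acts like a semaphore that caps how many requests can be in flight at once. The effect can be simulated with plain asyncio (a sketch; `asyncio.sleep` stands in for the network round-trip, and the numbers are made up):

```python
import asyncio
import time


async def request(sem):
    async with sem:               # a "connector slot": only `limit` requests in flight
        await asyncio.sleep(0.1)  # simulated network round-trip


async def run(n, limit):
    sem = asyncio.Semaphore(limit)
    start = time.perf_counter()
    await asyncio.gather(*(request(sem) for _ in range(n)))
    return time.perf_counter() - start


# With limit >= n all requests overlap; with a small limit they queue up
print('limit=20: %.2fs' % asyncio.run(run(20, 20)))  # roughly one round-trip
print('limit=5:  %.2fs' % asyncio.run(run(20, 5)))   # roughly four round-trips
```

With only 10 concurrent requests the default limit of 100 is not the bottleneck here, but it matters as soon as you scale the burst up.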


Note also that the article about making a million requests doesn't promise "per second".

You are probably confusing it with an article about handling requests on the server side using Japronto, which is something completely different (not to mention that article has issues of its own).


Update:

There will always be overhead related to preparing a request. You can try to use a single session to save some of that time:

import time
import aiohttp
import asyncio


async def asynchronous():
    # Reuse one session for all requests instead of creating one per request
    async with aiohttp.ClientSession() as session:
        tasks = [f('NumberFrom', 'NumberTo', 'asyncio imo', session) for _ in range(10)]
        await asyncio.gather(*tasks)


async def f(NumberFrom, NumberTo, MessageBody, session):
    try:
        print('Sent at %s' % time.time())
        await session.get('http://httpbin.org/delay/1')
        print('Done at %s' % time.time())
    except Exception as err:
        print('Error encountered at %s: %s' % (time.time(), err))


asyncio.run(asynchronous())
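As a sanity check on where the time goes, you can measure throughput directly. A minimal sketch, with `asyncio.sleep` as a hypothetical stand-in for the network call (`fake_request` and the 100 ms figure are made up):

```python
import asyncio
import time


async def fake_request():
    # Stand-in for session.post(); simulates ~100 ms of network latency
    await asyncio.sleep(0.1)


async def burst(n):
    start = time.perf_counter()
    await asyncio.gather(*(fake_request() for _ in range(n)))
    return time.perf_counter() - start


elapsed = asyncio.run(burst(100))
# 100 concurrent "requests" complete in roughly the latency of a single one;
# real throughput is then limited by per-request setup overhead, not latency
print('%d requests in %.2fs (%.0f req/s)' % (100, elapsed, 100 / elapsed))
```

If the real script falls far short of a benchmark like this, the gap is the per-request Python/aiohttp overhead discussed above.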
Mikhail Gerasimov
  • My script only makes 10 POST requests. That page you linked talks about how by default aiohttp only allows 100 simultaneous connections. I don't think that's my bottleneck; I'm only making 10 connections at a time. – Arbi Bushka Mar 21 '19 at 14:56
  • @ArbiBushka I reread your question. So right now you're sending 10 requests and all of them are finished after 1 second, am I right? What other result did you expect, then? – Mikhail Gerasimov Mar 21 '19 at 15:55
  • I'm sending 10 requests and they are all sent in about 0.10 seconds. A response is returned in under a second. I guess I'm just looking for more throughput. I thought asyncio would be able to send a few thousand requests per second. – Arbi Bushka Mar 21 '19 at 18:25
  • @ArbiBushka asyncio is able to *start* multiple requests in parallel. Each started request, however, needs time to travel through the wires to the server and back to the client. Even over an optical fiber link between server and client, it will still take at least length_of_wire / speed_of_light time (+ some Python overhead). You can't avoid that. – Mikhail Gerasimov Mar 21 '19 at 18:31
  • I know that web requests take time. I thought I would be able to send more requests per second. I'm asking about my request throughput, not my speed per request. – Arbi Bushka Mar 21 '19 at 18:53
  • @ArbiBushka I updated the answer. You can also try using [uvloop](https://github.com/MagicStack/uvloop). There isn't much more that can be done: Python and aiohttp have some overhead for preparing each request. – Mikhail Gerasimov Mar 21 '19 at 19:07
  • I came across uvloop; it isn't available on Windows. – Arbi Bushka Mar 22 '19 at 15:24
  • Passing the same aiohttp session to each function helped. Also, I was running the script in Python's default IDE, IDLE. Running it in PyCharm or through the command prompt boosted performance a lot. I don't recall what each modification did individually, but I am now safely above 1,000 POST requests per second. – Arbi Bushka Mar 26 '19 at 00:47