I'm trying to load data from a local Postgres database as quickly as possible, and the most performant Python package for this appears to be asyncpg. My code is synchronous, and I repeatedly need to load chunks of data. I don't want the async keyword to propagate to every function I've written, so I'm trying to wrap the async code in a synchronous function.
The code below works, but is incredibly ugly:
import asyncio
import asyncpg

def connect_to_postgres(user, password, database, host):
    async def wrapped():
        return await asyncpg.connect(user=user, password=password,
                                     database=database, host=host)
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(wrapped())

db_connection = connect_to_postgres(keys['user'], keys['password'],
                                    'db', '127.0.0.1')
def fetch_from_postgres(query, db_connection):
    async def wrapped():
        return await db_connection.fetch(query)
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(wrapped())

fetch_from_postgres("SELECT * FROM db LIMIT 5", db_connection)
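The two wrappers above can at least be collapsed into a single helper that runs any awaitable to completion on one persistent loop (a sketch; run_sync is a name I'm introducing here, not part of asyncpg):

```python
import asyncio

# One persistent event loop, so objects created inside it (such as an
# asyncpg connection) can be reused across later calls.
_loop = asyncio.new_event_loop()

def run_sync(coro):
    """Block until the awaitable completes and return its result."""
    return _loop.run_until_complete(coro)

# Hypothetical usage with the code above:
# db_connection = run_sync(asyncpg.connect(user=user, password=password,
#                                          database=database, host=host))
# rows = run_sync(db_connection.fetch("SELECT * FROM db LIMIT 5"))
```

The single loop matters: an asyncpg connection is tied to the loop it was created on, so spinning up a fresh loop per call would break reuse of the connection.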
In Julia I would do something like
f() = @async 5
g() = fetch(f())
g()
But in Python it seems I have to do the rather clunky,
async def f():
    return 5

def g():
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(f())
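(For what it's worth, since Python 3.7 the loop management in g can be hidden behind asyncio.run; a sketch:)

```python
import asyncio

async def f():
    return 5

def g():
    # asyncio.run creates a fresh event loop, runs the coroutine to
    # completion, and closes the loop afterwards.
    return asyncio.run(f())
```

Note that because asyncio.run closes its loop on exit, it wouldn't suit the asyncpg case above, where a connection must outlive a single call; it only tidies up one-shot invocations like this one.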
Just wondering if there's a better way?
Edit: the latter python example can of course be written using
def fetch(x):
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(x)
Although I still need to create an async wrapped function, unless I'm missing something.
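To illustrate the fetch helper with a stand-in coroutine (add is hypothetical, and new_event_loop replaces get_event_loop here only to keep the snippet self-contained):

```python
import asyncio

loop = asyncio.new_event_loop()

def fetch(x):
    return loop.run_until_complete(x)

async def add(a, b):
    return a + b

# Calling add(2, 3) runs nothing yet; it just builds a coroutine
# object, which fetch() then drives to completion.
result = fetch(add(2, 3))  # → 5
```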
Edit 2: I do care about performance, but I wish to use a synchronous programming approach. asyncpg is about 3x faster than psycopg2 because its core is implemented in Cython rather than Python; this is explained in more detail at https://magic.io/blog/asyncpg-1m-rows-from-postgres-to-python/. Hence my desire to wrap this asynchronous code.
Edit 3: another way of putting this question: what's the best way to avoid "what color is your function" in Python?
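One way to hide the color difference is to package the pattern as a decorator that gives each async function a synchronous calling convention (a sketch; make_sync and fetch_rows are names introduced for illustration, not a real API):

```python
import asyncio
import functools

# Persistent loop shared by all wrapped functions.
_loop = asyncio.new_event_loop()

def make_sync(async_fn):
    """Wrap an async function so it can be called from synchronous code."""
    @functools.wraps(async_fn)
    def wrapper(*args, **kwargs):
        return _loop.run_until_complete(async_fn(*args, **kwargs))
    return wrapper

@make_sync
async def fetch_rows(query):
    # Stand-in for something like db_connection.fetch(query).
    return [query]

rows = fetch_rows("SELECT 1")  # plain synchronous call
```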