
I used the threading module to open multiple SQL*Loader sessions and it worked fine. I'm having trouble achieving the same degree of parallelism with the asyncio module (coroutines).

This code always loads sequentially in Python 3.5:

import asyncio
import os
from subprocess import Popen, PIPE, STDOUT

async def load_data(filename):
    loadConf = ('sqlldr SKIP=%s %s userid=%s DATA=%s control=%s LOG=%s.log BAD=%s.bad DISCARD=/dev/null' % (...)).split(' ')

    # Popen/communicate block the calling thread (and thus the whole
    # event loop), so the two load_data coroutines run one after the other.
    p = Popen(loadConf, stdin=PIPE, stdout=PIPE, stderr=STDOUT, shell=False, env=os.environ)
    output, err = p.communicate(pwd.encode())
    status = p.wait()

async def main():
    await asyncio.wait([load_data(file1), load_data(file2)])


if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
olekb

1 Answer


Yes, it is possible, but you have to use asyncio.create_subprocess_shell (or asyncio.create_subprocess_exec) instead of subprocess.Popen, since the latter knows nothing about the event loop and simply blocks inside load_data until the subprocess completes. Here is a relevant example.
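A minimal sketch of the idea, not the asker's actual loader: it spawns two short `python -c` sleeps instead of `sqlldr` (the command, the `run_cmd` helper, and the timing check are illustrative assumptions). Because `asyncio.create_subprocess_exec` returns as soon as the child starts, and awaiting `communicate()` yields control back to the event loop, both children run concurrently. `asyncio.run` requires Python 3.7+; on 3.5 use `loop.run_until_complete(main())` as in the question.

```python
import asyncio
import sys
import time

async def run_cmd(seconds):
    # Hypothetical stand-in for the sqlldr invocation: a child process
    # that sleeps, then prints. Starting it does not block the loop.
    p = await asyncio.create_subprocess_exec(
        sys.executable, "-c",
        "import time; time.sleep(%d); print('done')" % seconds,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.STDOUT,
    )
    # Awaiting communicate() suspends this coroutine while the child
    # runs, letting the other coroutine's child run at the same time.
    out, _ = await p.communicate()
    return p.returncode, out.decode().strip()

async def main():
    # Both children sleep for 1 s; run concurrently the total wall
    # time is ~1 s rather than ~2 s.
    start = time.monotonic()
    results = await asyncio.gather(run_cmd(1), run_cmd(1))
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results, round(elapsed, 1))
```

If you must keep `Popen` (e.g. to reuse existing code), the blocking call can instead be pushed off the loop with `loop.run_in_executor`, which runs it in a thread pool; that is essentially the threading solution driven from asyncio.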

Gosha F