I'd like to be able to kick off a big batch of jobs in the background without using bash scripts, and keep working in the same kernel while they run. Is this possible? I am open to architecture changes, but the end users of my library are likely not very sophisticated.
[1] create_batch = my_batch.create(**batch_input)
[2] run_batch = start_async_process(   # real Python... not bash
        my_batch.execute_jobs          # e.g. a long-running call like sleep(9999)
    )
[3] print("I can still do stuff while that runs!")
[4] my_batch.get_status()
- The `[n]` markers represent IPython cells.
- Python 3.7.6, from within JupyterLab.
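For reference, this is the kind of pattern I'm imagining, sketched with the standard-library `concurrent.futures` (the `execute_jobs` function here is a stand-in for my real batch code, not an actual API of my library):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def execute_jobs():
    """Stand-in for the real long-running batch work."""
    time.sleep(2)  # pretend this is hours of jobs
    return "all jobs finished"

# submit() returns a Future immediately; the work runs on a worker thread
executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(execute_jobs)

# the kernel is free: subsequent cells execute while the batch runs
print("I can still do stuff while that runs!")

# poll for status without blocking, like a get_status() call would
print("running:", future.running(), "done:", future.done())

# .result() blocks only at the point where the answer is actually needed
result = future.result()
print(result)
```

The question is whether something like this is the right architecture for unsophisticated end users, or whether a separate process (e.g. `multiprocessing` or an external queue) is the better fit.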