I have written a Python module to query a database and read the results into a dataframe. Some of these queries are quite big and are causing the module to exit, i.e. I get:
Exited
printed to the screen. Digging a bit deeper, I find
Memory cgroup out of memory: Kill process
So it's running out of memory. My question is: how do I capture that kill signal so I can print a useful error message, e.g. "you need to request more resources to run this command..."? Currently I have:
    import signal
    import pandas as pd

    kill_now = False

    def exit_gracefully(signum, frame):
        global kill_now
        kill_now = True

    signal.signal(signal.SIGINT, exit_gracefully)
    signal.signal(signal.SIGTERM, exit_gracefully)

    sql_reader = pd.read_sql(query, conn, chunksize=1000)
    table_data = []
    for data in sql_reader:
        if kill_now:
            break
        table_data.append(data)

    if kill_now:
        print("ran out of memory...")
But this doesn't catch the "Killed" signal — the handlers are never invoked and the process just dies.
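One thing I've considered (as I understand it, SIGKILL can't be trapped from inside the dying process at all) is wrapping the query in a child process and having the parent inspect the child's exit status, since a process killed by signal N exits with return code -N on POSIX. The function name and arguments below are just placeholders, not part of my actual module:

    import signal
    import subprocess
    import sys

    def run_query_script(args):
        # Run the query code in a child Python process.
        result = subprocess.run([sys.executable] + args)
        # On POSIX, a child killed by signal N has returncode == -N,
        # so -signal.SIGKILL (-9) means the OOM killer got it.
        if result.returncode == -signal.SIGKILL:
            print("ran out of memory: you need to request more "
                  "resources to run this command")
        return result.returncode

Would something like this be the right approach, or is there a way to get the message printed from inside the module itself?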