I have a distributed computation framework that uses Celery + RabbitMQ + supervisor. Each worker task reads from a MySQL database, computes some values, and updates the database once the computation is done. However, when I run multiple workers in a distributed fashion, I keep hitting this error:
(2014, "Commands out of sync; you can't run this command now")
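For reference, the Celery app itself is set up in the standard way; the broker URL below is just a placeholder for my actual RabbitMQ instance:

from celery import Celery

# placeholder broker URL; the real one points at my RabbitMQ server
app = Celery('tasks', broker='amqp://guest:guest@localhost:5672//')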
Can anyone suggest a way to set up a mutex or lockfile-like mechanism so that the workers can access the database concurrently?
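Something built on MySQL's GET_LOCK()/RELEASE_LOCK() is roughly what I have in mind; the helper names below are made up purely for illustration, not something I already have:

def acquire_table_lock(con, name, timeout=10):
    # GET_LOCK returns 1 on success, 0 if the timeout expired
    cur = con.cursor()
    cur.execute("SELECT GET_LOCK(%s, %s)", (name, timeout))
    return cur.fetchone()[0] == 1

def release_table_lock(con, name):
    cur = con.cursor()
    cur.execute("SELECT RELEASE_LOCK(%s)", (name,))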
Any help will be appreciated. Thanks, Amit
Edit:
import MySQLdb as mdb

# module-level connection, shared by every task running in this worker process
con = mdb.connect(parameters...)

def reset_table(table_name, con):
    with con:
        cur = con.cursor(mdb.cursors.DictCursor)
        cur.execute("UPDATE " + table_name +
                    " SET active_status = 0 WHERE last_access < (NOW() - INTERVAL 15 MINUTE)")
        con.commit()
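The tasks reach this helper through that same module-level connection; below is roughly how (the argument and the bodies are placeholders, only the call chain matches the stack trace further down):

@app.task
def download_data(item_id):          # illustrative argument name
    auth = get_auth(con)             # tasks.py line 183 in the trace
    # ... read from the DB, compute, update the DB ...

def get_auth(con):
    reset_table("auths", con)        # tasks.py line 94 in the trace
    # ... fetch and return an auth row ...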
Stack trace:
File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 238, in trace_task
R = retval = fun(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/celery/app/trace.py", line 416, in __protected_call__
return self.run(*args, **kwargs)
File "/home/elasticsearch/celery_test/tasks.py", line 183, in download_data
auth = get_auth(con)
File "/home/elasticsearch/celery_test/tasks.py", line 94, in get_auth
reset_table("auths",con)
File "/usr/lib/python2.7/dist-packages/MySQLdb/connections.py", line 249, in __exit__
self.rollback()
ProgrammingError: (2014, "Commands out of sync; you can't run this command now")