I am writing a small suite of tests in Python that must interact with MySQL, and to speed up runtime I am using a multiprocessing Pool to map across CPU cores.
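The harness looks roughly like this (a simplified sketch, not my real code; run_test, TestSuite, and TEST_NAMES are placeholder names):

from multiprocessing import Pool

def run_test(name):
    # Each worker process constructs its own test object, which in turn
    # creates Cursor instances (and thus its own connection queue).
    suite = TestSuite()  # placeholder for the real test class
    return suite.run(name)

if __name__ == '__main__':
    pool = Pool()  # defaults to one worker per CPU core
    results = pool.map(run_test, TEST_NAMES)
    pool.close()
    pool.join()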
Each test instantiates an object containing the functions under test. The object is also responsible for initializing and managing the lifecycle of its database connection, and it implements a context-manager cursor that recycles connections through a queue:
import Queue

import MySQLdb as mysql  # MySQL-python, aliased as "mysql" throughout


class Cursor(object):
    # Class-level pool: connections are parked here between uses.
    _cache = Queue.Queue(maxsize=5)

    def __init__(self, cursor_type=mysql.cursors.Cursor, **options):
        super(Cursor, self).__init__()
        try:
            conn = self._cache.get_nowait()
        except Queue.Empty:
            conn = mysql.connect(**options)
        else:
            conn.ping(True)  # reconnect if the cached connection has gone away
        self.conn = conn
        self.conn.autocommit(False)
        self.cursor_type = cursor_type

    @classmethod
    def clear_cache(cls):
        cls._cache = Queue.Queue(maxsize=5)

    def __enter__(self):
        self.cursor = self.conn.cursor(self.cursor_type)
        return self.cursor

    def __exit__(self, extype, exvalue, traceback):
        # Roll back on any MySQL error (rollback is a connection method,
        # not a cursor method, and exception subclasses must match too).
        if extype is not None and issubclass(extype, mysql.MySQLError):
            self.conn.rollback()
        self.cursor.close()
        self.conn.commit()
        try:
            self._cache.put_nowait(self.conn)
        except Queue.Full:
            self.conn.close()
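For context, the tests drive the cursor through the context manager, along these lines (a simplified sketch; the table and query are made up):

def count_widgets(**options):
    # Hypothetical example of how the tests use Cursor; "widgets" is a
    # made-up table name, not part of the real schema.
    with Cursor(**options) as cur:
        cur.execute("SELECT COUNT(*) FROM widgets")
        return cur.fetchone()[0]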
When my tests were batched in a Pool, the process would always fail with this exception:
_mysql_exceptions.OperationalError: (2013, 'Lost connection to MySQL server during query')
I eventually narrowed the disconnection problem down to the queue (not the queue size, however - a queue size of 1 did not fix it). After removing the queuing-related parts of the Cursor object, multiprocessing is now stable.
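Concretely, the stripped-down Cursor now looks roughly like this (same imports as above; each instance opens a fresh connection and closes it on exit, with no shared cache):

class Cursor(object):
    # Queue-free variant: one fresh connection per instance.
    def __init__(self, cursor_type=mysql.cursors.Cursor, **options):
        super(Cursor, self).__init__()
        self.conn = mysql.connect(**options)
        self.conn.autocommit(False)
        self.cursor_type = cursor_type

    def __enter__(self):
        self.cursor = self.conn.cursor(self.cursor_type)
        return self.cursor

    def __exit__(self, extype, exvalue, traceback):
        if extype is not None and issubclass(extype, mysql.MySQLError):
            self.conn.rollback()
        self.cursor.close()
        self.conn.commit()
        self.conn.close()  # discard instead of caching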
I don't understand why a connection queue would cause MySQL to lose its connection. Since each process has a separate test object with its own (largely unused) queue, I wouldn't expect any interference. Can someone shed some light on this?