In this post, Fredrik Lundh explains why memory may not be returned to the system even though there is no memory leak. Near the bottom he explains why (in Python 2) range(50*1024*100) may consume a lot of memory that will not be freed, even after the list is deleted, and he mentions that using xrange avoids the memory problem in the first place.
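(In Python 3, range is already lazy and behaves like Python 2's xrange; the difference in memory footprint is easy to see. The sizes printed below vary by platform, so the comments only give rough magnitudes:)

```python
import sys

# A materialized list of ~5 million ints holds every element in memory.
big_list = list(range(50 * 1024 * 100))

# A range object stores only start/stop/step, regardless of length.
lazy = range(50 * 1024 * 100)

print(sys.getsizeof(big_list))  # tens of megabytes
print(sys.getsizeof(lazy))      # a few dozen bytes
```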
Similarly, using an SSDictCursor instead of a DictCursor may be a way to avoid the memory problem in your case. An SSDictCursor causes the MySQL server to retain the result set on the server side, and the cursor fetches rows from that result set one at a time, only as needed:
import MySQLdb
import MySQLdb.cursors as cursors

conn = MySQLdb.connect(..., cursorclass=cursors.SSDictCursor)  # 1
cursor = conn.cursor()
cursor.execute('select * from test')  # 2
for row in cursor:                    # 3
    print(row)
conn.close()
- Note the cursorclass=cursors.SSDictCursor in the call to connect (#1).
- With a DictCursor (or any non-SS cursor), the call to execute (#2) would cause MySQLdb to load the entire result set into a Python object, such as a list of dicts.
- With an SSDictCursor, MySQLdb retrieves the rows one at a time (#3). So this avoids the memory build-up problem, provided you don't need to hold the entire result set in Python all at once.
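The same row-at-a-time loop works with any lazily fetching DB-API cursor. As a self-contained illustration (using the standard library's sqlite3 rather than MySQLdb, since sqlite3 cursors also stream rows as you iterate), the pattern looks identical:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table test (id integer, name text)')
conn.executemany('insert into test values (?, ?)',
                 [(i, 'row%d' % i) for i in range(5)])

# Iterating over the cursor pulls rows as needed instead of
# materializing the whole result set up front.
cursor = conn.cursor()
cursor.execute('select * from test')
rows = []
for row in cursor:
    rows.append(row)
conn.close()
print(rows[0])  # (0, 'row0')
```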
Note also that when using an SSCursor or SSDictCursor, "No new queries can be issued on the connection until the entire result set has been fetched." Cursors from two different connections, however, can be used concurrently. That may not be an issue for you, but it's something to be aware of.
You might also want to check out oursql, an alternative DB adapter for MySQL. oursql cursors are server-side cursors which fetch lazily by default.