
I'm trying to find a memory-efficient way to run a paged query that tests for an empty collection, but I can't figure out how to do it efficiently on a large database. The table layout uses an association object with bidirectional backrefs, very similar to the example in the documentation.

from sqlalchemy import Column, ForeignKey, Integer, Text
from sqlalchemy.orm import backref, declarative_base, relationship

Base = declarative_base()

class Association(Base):
    __tablename__ = 'Association'
    assoc_id = Column(Integer, primary_key=True, nullable=False, unique=True)
    member_id = Column(Integer, ForeignKey('Member.id'))
    chunk_id = Column(Integer, ForeignKey('Chunk.id'))
    extra = Column(Text)
    chunk = relationship("Chunk", backref=backref("assoc", lazy="dynamic"))

class Member(Base):
    __tablename__ = 'Member'
    id = Column(Integer, primary_key=True, nullable=False, unique=True)
    name = Column(Text, unique=True)
    assocs = relationship("Association", backref="member", cascade="all, delete", lazy="dynamic")

class Chunk(Base):
    __tablename__ = 'Chunk'
    id = Column(Integer, primary_key=True, nullable=False, unique=True)
    name = Column(Text, unique=True)

If the member is deleted, it will cascade and delete the member's associations. However, the chunk objects will be orphaned in the database. To delete the orphaned chunks, I can test for an empty collection using a query like this:

session.query(Chunk).filter(~Chunk.assoc.any())

and then delete the chunks with:

query.delete(synchronize_session=False)
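
For reference, `any()` here compiles to a correlated `NOT EXISTS` subquery, so both the filtering and the bulk delete run in the database. A minimal self-contained sketch of the pattern (an in-memory SQLite illustration, not the original code):

```python
from sqlalchemy import Column, ForeignKey, Integer, Text, create_engine
from sqlalchemy.orm import backref, declarative_base, relationship, sessionmaker

Base = declarative_base()

class Association(Base):
    __tablename__ = 'Association'
    assoc_id = Column(Integer, primary_key=True)
    chunk_id = Column(Integer, ForeignKey('Chunk.id'))
    chunk = relationship("Chunk", backref=backref("assoc", lazy="dynamic"))

class Chunk(Base):
    __tablename__ = 'Chunk'
    id = Column(Integer, primary_key=True)
    name = Column(Text, unique=True)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# One chunk is referenced by an association, one is orphaned.
session.add_all([Chunk(id=1, name='kept'), Chunk(id=2, name='orphan'),
                 Association(assoc_id=1, chunk_id=1)])
session.commit()

# Select chunks with no associations and delete them in one statement.
orphans = session.query(Chunk).filter(~Chunk.assoc.any())
orphans.delete(synchronize_session=False)  # DELETE ... WHERE NOT EXISTS (...)
session.commit()
```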

However, if the association and chunk tables are large it seems the query or subquery loads up everything and the memory skyrockets.

I've seen the concept of using a paged query to limit the memory usage of standard queries here:

def page_query(q, count=1000):
    offset = 0
    while True:
        r = False
        for elem in q.limit(count).offset(offset):
            r = True
            yield elem
        offset += count
        if not r:
            break

for chunk in page_query(Session.query(Chunk)):
    print(chunk.name)

However this doesn't appear to work with the empty collection query as the memory usage is still high. Is there a way to do a paged query for an empty collection like this?

1 Answer

I figured out a couple of things were missing here. The query for the empty chunks appears to be mostly OK. The memory usage spike I was seeing was from a query a few lines earlier in the code when the actual member itself was deleted.

member = session.query(Member).filter(Member.name == membername).one()
session.delete(member)

According to the documentation, the session (by default) can only delete objects that are loaded into the session / memory. When the member is deleted, it loads all of its associations in order to delete them per the cascade rules. What needed to happen was to bypass the association loading by using passive deletes.

I added:

passive_deletes=True

to the association relationship of the Member class and:

ondelete='CASCADE'

to the member_id foreign key of the Association class. I'm using SQLite3 and added foreign key support with an engine connect event per the docs.
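
Putting those pieces together, the modified mappings might look like the sketch below. This is an illustrative, self-contained version using an in-memory SQLite database; the connect-event listener is the pattern the SQLAlchemy docs give for enabling SQLite foreign key enforcement:

```python
from sqlalchemy import Column, ForeignKey, Integer, Text, create_engine, event
from sqlalchemy.engine import Engine
from sqlalchemy.orm import backref, declarative_base, relationship, sessionmaker

Base = declarative_base()

@event.listens_for(Engine, "connect")
def set_sqlite_pragma(dbapi_connection, connection_record):
    # SQLite ships with foreign key enforcement off; turn it on for
    # every new connection so ON DELETE CASCADE actually fires.
    cursor = dbapi_connection.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()

class Association(Base):
    __tablename__ = 'Association'
    assoc_id = Column(Integer, primary_key=True)
    # ondelete='CASCADE': the database removes association rows itself.
    member_id = Column(Integer, ForeignKey('Member.id', ondelete='CASCADE'))
    chunk_id = Column(Integer, ForeignKey('Chunk.id'))
    extra = Column(Text)
    chunk = relationship("Chunk", backref=backref("assoc", lazy="dynamic"))

class Member(Base):
    __tablename__ = 'Member'
    id = Column(Integer, primary_key=True)
    name = Column(Text, unique=True)
    # passive_deletes=True: don't load the collection just to delete it;
    # the ON DELETE CASCADE on member_id does the work in the database.
    assocs = relationship("Association", backref="member",
                          cascade="all, delete", passive_deletes=True,
                          lazy="dynamic")

class Chunk(Base):
    __tablename__ = 'Chunk'
    id = Column(Integer, primary_key=True)
    name = Column(Text, unique=True)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

session.add_all([Member(id=1, name='m1'), Chunk(id=1, name='c1'),
                 Association(assoc_id=1, member_id=1, chunk_id=1)])
session.commit()

member = session.query(Member).filter(Member.name == 'm1').one()
session.delete(member)  # associations are removed by the database cascade
session.commit()
```

Deleting the member now issues a single `DELETE` against `Member` without loading its associations; the orphaned chunk rows remain and can be cleaned up separately.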

In regards to the orphan chunks, instead of doing a bulk delete with the query.delete method, I used a paged query that doesn't include an offset and deleted the chunks from the session in a loop, as shown below. So far I haven't seen any memory spikes:

def page_query(q):
    while True:
        r = False
        for elem in q.limit(1000):
            r = True
            yield elem
        if not r:
            break

for chunk in page_query(query):
    # Do something with the chunk if needed
    session.delete(chunk)
session.commit()

To make a long story short, it helped greatly to use passive_deletes=True when deleting a parent object that has a large collection. The paged query also works well in this situation, except that I had to take out the offset since the chunks were being removed from the session as the loop progressed.