
Due to complications with how Django and SQLAlchemy handle JSON data, I've had to create a system that uses models from both of them, as well as plain SQLAlchemy Core.

The problem I'm having is that when I update information within a table via the table.update() method, there's a considerable delay before my SQLAlchemy table model picks up the change.

Is there a way to force the model to update?

My code is along these lines:

# Database Connection
from sqlalchemy import create_engine, inspect, Column, Integer, JSON, MetaData
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import scoped_session, sessionmaker

# assumes dbserver and dbname are a bare hostname and database name
engine = create_engine(
    'mysql+pymysql://' + dbusername + ':' + dbuserpass + '@' + dbserver + '/' + dbname,
    pool_recycle=3600, echo=False)
con = scoped_session(sessionmaker(autocommit=True, autoflush=False, bind=engine))
Session = sessionmaker(bind=engine)
sess = Session()
meta = MetaData(engine)
insp = inspect(engine)
Base = declarative_base()
con.close()
engine.dispose()

# sqlalchemy table model
class ContactsTable(Base):
    __tablename__ = 'contacts_tbl'
    db_id = Column(Integer, primary_key=True)
    per_contact_id = Column(JSON)
    createdDateTime = Column(JSON)
    lastModifiedDateTime = Column(JSON)
    distributionLists = Column(JSON)

# There's a lot of code missing here, but you can see the basics of what I'm doing: adding data and then reading it back

def add_to_dist(contact, dist, tbl=ContactsTable.__table__):
    # Core update, executed outside the ORM Session
    con.execute(tbl.update()
                .values(distributionLists=dist)
                .where(tbl.c.per_contact_id == contact))

def get_dist_members(name):
    # ORM query through the long-lived Session
    data = sess.query(ContactsTable).filter(
        ContactsTable.distributionLists.contains(name)).all()
    return data

Everything works. It's just that the query data is out of date and can take anywhere up to 10 minutes to refresh. This is a problem because the data is displayed on a web page, which really needs to reflect the changes instantly.

If in fact I'm doing this whole thing incorrectly then feel free to school me!

scottapotamus
  • It sounds like you're keeping a transaction open and then after ~10 minutes something triggers a new transaction – maybe the connection is recycled or such. Since you're using MySQL you're probably using its default transaction isolation level REPEATABLE READ (nomen est omen), and so keeping a transaction open means you're reading from the same established snapshot. In addition a `Session` in SQLAlchemy is oblivious to changes done using Core (`execute(tbl.update()...)`), and you will have to manually expire affected objects. – Ilja Everilä Nov 15 '19 at 07:37
  • Related reading: https://stackoverflow.com/questions/10210080/how-to-disable-sqlalchemy-caching, https://stackoverflow.com/questions/56640429/using-session-query-to-read-uncommitted-data-in-sqlalchemy – Ilja Everilä Nov 15 '19 at 07:43
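
A minimal sketch of the refresh step the comments describe, assuming the sess, con, and ContactsTable objects from the question fit together this way (the question's missing code may differ):

def add_to_dist(contact, dist, tbl=ContactsTable.__table__):
    # The Core update runs outside the ORM Session, so the Session
    # neither sees the new data nor expires its cached objects.
    con.execute(tbl.update()
                .values(distributionLists=dist)
                .where(tbl.c.per_contact_id == contact))
    # End the Session's open transaction so the next query starts a new
    # REPEATABLE READ snapshot; rollback() also expires all loaded
    # objects by default, so they reload on next access.
    sess.rollback()

If only specific objects are affected, sess.expire(obj) is a lighter-weight alternative to expiring everything.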

1 Answer


Solved it by adding isolation_level="READ UNCOMMITTED" to the engine:

engine = create_engine(
    'mysql+pymysql://' + dbusername + ':' + dbuserpass + '@' + dbserver + '/' + dbname,
    pool_recycle=3600, echo=False, isolation_level="READ UNCOMMITTED")

Thanks to Ilja for pointing me in the right direction.
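
One caveat with READ UNCOMMITTED: it permits dirty reads, so the page can briefly show data from transactions that later roll back. If that matters, READ COMMITTED makes each statement read the latest committed data, and combined with expiring the Session as sketched above it also reflects changes immediately. A sketch, assuming the same connection variables as above:

engine = create_engine(
    'mysql+pymysql://' + dbusername + ':' + dbuserpass + '@' + dbserver + '/' + dbname,
    pool_recycle=3600, echo=False, isolation_level="READ COMMITTED")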

scottapotamus