Due to complications with how Django and SQLAlchemy handle JSON data, I've had to build a system that uses models from both of them, as well as plain SQLAlchemy Core.
The problem I'm having is that when I update information in a table via the table.update() method, there's a considerable delay before my SQLAlchemy table model picks up the change.
Is there a way to force the model to update?
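To make that concrete, this is the kind of forced refresh I'm imagining (just a sketch, and my assumption is that the staleness comes from the ORM session caching objects it has already loaded):

# Hypothetical: expire everything the session holds so that the next
# attribute access or query re-SELECTs from the database.
sess.expire_all()
fresh = sess.query(ContactsTable).all()  # would this now see the update?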
My code is along these lines:
# Database connection
# (dbusername, dbuserpass, dbserver and dbname are defined elsewhere)
from sqlalchemy import create_engine, MetaData, inspect, Column, Integer, JSON
from sqlalchemy.orm import sessionmaker, scoped_session
from sqlalchemy.ext.declarative import declarative_base

engine = create_engine('mysql+pymysql://' + dbusername + ':' + dbuserpass
                       + '@' + dbserver + '/' + dbname,
                       pool_recycle=3600, echo=False)
con = scoped_session(sessionmaker(autocommit=True, autoflush=False, bind=engine))
Session = sessionmaker(bind=engine)
sess = Session()
meta = MetaData(engine)
insp = inspect(engine)
Base = declarative_base()
con.close()       # closes the current scoped session
engine.dispose()  # resets the engine's connection pool
# SQLAlchemy table model
class ContactsTable(Base):
    __tablename__ = 'contacts_tbl'

    db_id = Column(Integer, primary_key=True)
    per_contact_id = Column(JSON)
    createdDateTime = Column(JSON)
    lastModifiedDateTime = Column(JSON)
    distributionLists = Column(JSON)  # JSON list of distribution lists
# There's a lot of code missing here, but you can see the basics of what
# I'm doing: adding data, then reading it back
def add_to_dist(contact, dists, tbl=contacts_tbl):
    # contacts_tbl here is the Core Table object for 'contacts_tbl'
    con.execute(tbl.update()
                .values(distributionLists=dists)
                .where(tbl.c.per_contact_id == contact))
def get_dist_members(name):
    data = sess.query(ContactsTable).filter(
        ContactsTable.distributionLists.contains(name)).all()
    return data
Everything works; it's just that the query data is out of date and can take up to 10 minutes to refresh. That's a problem because the data is displayed on a web page, and it really needs to reflect changes instantly.
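One possibility I've wondered about (untested, and the isolation-level angle is my assumption, not something I've confirmed): under MySQL's default REPEATABLE READ isolation, a long-lived session keeps reading from the same transaction snapshot, so ending that transaction before each read might be what forces fresh data:

def get_dist_members_fresh(name):
    # Hypothetical variant: end the session's current transaction so the
    # next query opens a new one and sees recently committed rows.
    sess.rollback()  # sess.commit() would end the snapshot too
    return sess.query(ContactsTable).filter(
        ContactsTable.distributionLists.contains(name)).all()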
If in fact I'm doing this whole thing incorrectly then feel free to school me!