
I am modifying a project that opens a SQLite database through SQLAlchemy.

I am running multiple processes of this application, which seems to cause some (locking?) issues. The processes stall waiting for IO (state D in top).

The database is only queried / read, never written to. Unfortunately, the 5 GB database file is on an NFS directory, so reading is not instantaneous.

Can I set the database to read-only? Will SQLite/SQLAlchemy then avoid the locking/transaction mechanism? Setting isolation_level="READ UNCOMMITTED" does not seem to be enough. It's also not clear to me from the documentation whether and how I should set SERIALIZABLE.
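
For the read-only idea, SQLite itself accepts mode=ro and immutable=1 as URI parameters (immutable=1 skips locking entirely, which should be safe as long as nothing ever writes to the file), and SQLAlchemy 1.4+ can apparently pass such a URI straight through. Something along these lines is what I have in mind, assuming DATABASE_FILE holds the path:

from sqlalchemy import create_engine

# SQLAlchemy 1.4+ passes the SQLite URI through; uri=true is consumed by SQLAlchemy itself
ENGINE = create_engine('sqlite:///file:' + DATABASE_FILE + '?mode=ro&immutable=1&uri=true')

(On older SQLAlchemy versions the same sqlite3 URI could presumably be handed over via create_engine's creator= argument instead.)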

Alternatively, I guess I could copy the database to memory first, but it is not clear how to hand the in-memory database over to SQLAlchemy. Should I copy on connect?


1 Answer


I was able to solve it by copying the database into memory, replacing

ENGINE = create_engine('sqlite:///' + DATABASE_FILE)

with:

from sqlalchemy import create_engine
import sqlite3

# An empty path means an in-memory SQLite database
ENGINE = create_engine('sqlite:///')

# Open the file read-only and copy it into the in-memory database
# (sqlite3.Connection.backup requires Python 3.7+)
filedb = sqlite3.connect('file:' + DATABASE_FILE + '?mode=ro', uri=True)
print("loading database to memory ...")
filedb.backup(ENGINE.raw_connection().connection)
print("loading database to memory ... done")
filedb.close()

followed by

from sqlalchemy.orm import sessionmaker, declarative_base  # SQLAlchemy 1.4+

BASE = declarative_base()
SESSION = sessionmaker(bind=ENGINE)
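
Note that 'sqlite:///' points at an in-memory database, and SQLAlchemy's default pool for in-memory SQLite gives each thread its own connection, i.e. its own empty database. If the application uses more than one thread, forcing a single shared connection with StaticPool should keep every session on the copy made above; roughly:

from sqlalchemy import create_engine
from sqlalchemy.pool import StaticPool

# One shared in-memory connection for all threads/sessions
ENGINE = create_engine('sqlite:///',
                       connect_args={'check_same_thread': False},
                       poolclass=StaticPool)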