I encountered the following error after inserting 95 of the 4000 rows of video metadata:

    sqlalchemy.exc.TimeoutError: QueuePool limit of size 5 overflow 10 reached, connection timed out, timeout 30 (Background on this error at: http://sqlalche.me/e/3o7r)
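For reference, the numbers in the message correspond to SQLAlchemy's QueuePool defaults, which can be set explicitly on the engine. A minimal sketch (the values shown are the library defaults; connect_string is the same one used below):

    import sqlalchemy as db

    engine = db.create_engine(
        connect_string,
        pool_size=5,      # persistent connections kept in the pool
        max_overflow=10,  # extra connections allowed beyond pool_size
        pool_timeout=30,  # seconds to wait before raising TimeoutError
    )

Raising these limits would only postpone the error if connections are leaking, so I treat them as diagnostics rather than a fix.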
Base:

    import sqlalchemy as db
    from sqlalchemy.orm import sessionmaker
    from sqlalchemy.ext.declarative import declarative_base

    engine = db.create_engine(connect_string)
    con = engine.connect()  # checked out at import time; never used or closed in the code shown
    _SessionFactory = sessionmaker(bind=engine)

    Base = declarative_base()

    def session_factory():
        Base.metadata.create_all(engine)  # runs on every call, not just once
        return _SessionFactory()
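To check whether this setup is leaking connections, the pool can be inspected between inserts; status() is part of the QueuePool API:

    # the "checked out" count should drop back toward 0 between inserts;
    # if it only grows, sessions or connections are not being released
    print(engine.pool.status())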
Video object:

    from sqlalchemy import Column, Date, DECIMAL, ForeignKey, Integer, String, Table
    from sqlalchemy.orm import relationship

    class Video(Base):
        __tablename__ = 'video'

        # association table for the many-to-many link between videos and persons
        video_to_person = Table('video_to_person', Base.metadata,
            Column('video_id', String, ForeignKey('video.vid')),
            Column('person_id', Integer, ForeignKey('person.id'))
        )

        _vid = Column("vid", String, primary_key=True)
        _webpage_url = Column("webpage_url", String)
        _upload_date = Column("upload_date", Date)
        _uploader = Column("uploader", String)
        _view_count = Column("view_count", DECIMAL)
        _like_count = Column("like_count", DECIMAL)
        _dislike_count = Column("dislike_count", DECIMAL)
        _format = Column("format", String)
        _duration = Column("duration", DECIMAL)
        _title = Column("title", String)
        _description = Column("description", String)

        persons = relationship("Person", secondary=video_to_person)
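The relationship above assumes a mapped Person class that is not shown in the question; a minimal sketch consistent with the person.id foreign key might look like:

    class Person(Base):
        __tablename__ = 'person'

        id = Column("id", Integer, primary_key=True)
        name = Column("name", String)  # hypothetical column; only 'id' is implied above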
Video repository:

    class VideoRepository():

        def create_video(self, vid: Video):
            session = session_factory()
            session.add(vid)
            session.commit()
            # close() is skipped if add() or commit() raises, leaving the
            # session's connection checked out of the pool
            session.close()
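If commit() ever raises (e.g. on a constraint violation), close() is never reached and that session's connection stays checked out; after enough such failures the pool of 5 + 10 is exhausted. A hardened sketch using try/finally, with the same session API as above and nothing new assumed:

    class VideoRepository():

        def create_video(self, vid: Video):
            session = session_factory()
            try:
                session.add(vid)
                session.commit()
            except Exception:
                session.rollback()  # discard the failed transaction
                raise
            finally:
                session.close()  # always return the connection to the pool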
How can I improve the connection management?
UPDATE: Thanks for the responses so far. One of the challenges is that all my model classes (e.g. class Video) inherit from Base, and the module that defines Base also creates a new engine object. I will look into further refactoring.
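One possible shape for that refactor (a sketch; the module names base.py and models.py are mine, not from the original code) is to create the engine, the session factory, and Base exactly once in a dedicated module and import them everywhere else:

    # base.py -- runs once per process; Python caches imported modules,
    # so create_engine() is never called a second time
    import sqlalchemy as db
    from sqlalchemy.orm import sessionmaker
    from sqlalchemy.ext.declarative import declarative_base

    engine = db.create_engine(connect_string)  # connect_string as in the question
    _SessionFactory = sessionmaker(bind=engine)
    Base = declarative_base()

    def session_factory():
        return _SessionFactory()

    # models.py -- model classes import the shared Base instead of building their own
    from base import Base, session_factory

With this layout, Base.metadata.create_all(engine) can also run once at startup instead of inside session_factory() on every call.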