When I commit changes to the table, it takes a long time (about 300 seconds per 1000 rows).
Model class:

import time

from sqlalchemy import ARRAY, Column, FLOAT, INTEGER, TEXT
from sqlalchemy.orm import sessionmaker

class Quotes(base):
    __tablename__ = 'quotes'
    id = Column(INTEGER, primary_key=True, autoincrement=True, index=True)
    ticker = Column(TEXT)
    data = Column(ARRAY(FLOAT))

    def create():
        # Build a session bound to the engine (db and base are set up elsewhere)
        Session = sessionmaker(db)
        session = Session()
        return session

    def add(session, ticker=None, data=None):
        # Queue one new row in the session; nothing hits the database until commit()
        new_row = Quotes(ticker=ticker, data=data)
        session.add(new_row)

    def commit(session):
        # Write the pending rows and report how long the commit took
        t0 = time.time()
        session.commit()
        print("SQLAlchemy ORM commit(): Total time " + str(time.time() - t0) + " secs")
The calling code looks like this:
counter = 0
for index in range(len(quotes['timestamp'])):
    # Pack one row as [timestamp, open, high, low, close]
    temp = [
        int(quotes['timestamp'][index]), quotes['open'][index],
        quotes['high'][index], quotes['low'][index],
        quotes['close'][index]
    ]
    Quotes.add(session, ticker, temp)
    counter += 1
    # Commit in batches of 1000 rows
    if counter % 1000 == 0:
        Quotes.commit(session)
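For reference, quotes is the data returned from my quotes source; the loop above assumes it is a dict of parallel lists, along these illustrative lines (the values here are made up):

# Illustrative shape of the input data; actual values come from the data source
quotes = {
    'timestamp': [1617181200, 1617181260],
    'open':  [50.1, 50.3],
    'high':  [50.4, 50.6],
    'low':   [49.9, 50.2],
    'close': [50.3, 50.5],
}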
I have about 100k rows. At the start I create a single session, then add a new row to it on each iteration of the for loop. I call the .commit() method every 1000 rows, and each commit takes about 300 seconds, which seems far too slow. Is this expected, or is there another (correct) way to insert such a large data set into the database?