I am currently working on an Association Rules project, and this is my first time working with SQL Server. I have a pandas DataFrame with all the results and want to transfer them to a SQL Server table.
However, the DataFrame has shape (1788020, 4), and when I run the code it takes too long and stalls at around 500 rows.
Just in case, this is the code I am using:
cursor2 = conn2.cursor()
cursor2.execute("truncate table APriori_test")
# one INSERT per row, with a single commit at the end
for index, row in dataset.iterrows():
    cursor2.execute(
        "INSERT INTO APriori_test(antecedents,consequents,support,confidence) values (?,?,?,?)",
        row.antecedents, row.consequents, row.support, row.confidence)
conn2.commit()
However, when I insert only 1000 rows at a time, for example, it runs smoothly with no problems.
How can I automatically insert the data in batches of, for example, 10000 rows at a time?
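Roughly this kind of batching is what I have in mind (a minimal sketch, assuming pyodbc is the driver; the connection string, the 10000-row batch size, and committing once per batch are just placeholders):

import pyodbc

# Placeholder connection string; adjust driver/server/database to your setup.
conn2 = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")

cursor2 = conn2.cursor()
cursor2.fast_executemany = True  # let pyodbc send parameters in bulk
cursor2.execute("truncate table APriori_test")

batch_size = 10000  # assumed batch size
# materialize the four columns as plain tuples
rows = list(dataset[["antecedents", "consequents", "support", "confidence"]]
            .itertuples(index=False, name=None))

for start in range(0, len(rows), batch_size):
    batch = rows[start:start + batch_size]
    cursor2.executemany(
        "INSERT INTO APriori_test(antecedents,consequents,support,confidence) VALUES (?,?,?,?)",
        batch)
    conn2.commit()  # commit after each batch so progress is saved incrementally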
I am open to other suggestions.
Thank you!