We have large Excel files (up to 100k rows and up to 150 columns, with around 30% of the columns containing long text values; file sizes vary from 10 MB to 60 MB).
I need to load these into our Oracle database as efficiently as possible.
My plan was to load each Excel file with xlrd, build an INSERT statement for every Excel row, and execute the SQL with cx_Oracle.
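To make the plan concrete, here is a minimal sketch of that pipeline. The table name, column names, and connection string are placeholders, and the xlrd/cx_Oracle part is shown as comments because it needs a real file and database:

```python
# Sketch of the planned load: read an Excel sheet with xlrd, then insert
# each row via cx_Oracle. All names below are placeholders.

def build_insert(table, columns):
    """Build one parameterised INSERT using numbered bind variables,
    rather than interpolating cell values into the SQL text."""
    col_list = ", ".join(columns)
    binds = ", ".join(":{0}".format(i + 1) for i in range(len(columns)))
    return "INSERT INTO {0} ({1}) VALUES ({2})".format(table, col_list, binds)

# Usage (requires xlrd and cx_Oracle, so not executed here):
#
# import xlrd
# import cx_Oracle
#
# book = xlrd.open_workbook("data.xls")
# sheet = book.sheet_by_index(0)
# headers = sheet.row_values(0)          # assumes row 0 holds column names
# sql = build_insert("MY_TABLE", headers)
#
# conn = cx_Oracle.connect("user/password@host/service")
# cur = conn.cursor()
# for r in range(1, sheet.nrows):
#     cur.execute(sql, sheet.row_values(r))  # one round trip per row
# conn.commit()
```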
My questions are:
Should I create and execute an INSERT statement for every single row (meaning many cursor.execute(sql) calls), or should I build one massive string containing all the inserts separated by semicolons, so that there is only a single cursor.execute(sql) call?
Assuming the whole operation takes a long time, is there a chance I'll hit a timeout from the database? I remember something similar a while ago: I was running a very long query via cx_Oracle, and after a few hours the connection timed out and was lost. What can be done about this?
Is this approach sound in the first place, or is there a better way to implement it?
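If it helps, here is roughly what I mean by the two options in my first question (MY_TABLE is a placeholder, and the literal-quoting helper is deliberately simplified). As far as I can tell, cx_Oracle's execute() accepts a single statement, so I assume the one-string variant would have to be wrapped in an anonymous PL/SQL BEGIN ... END; block:

```python
def sql_literal(value):
    """Very simplified literal formatting for this sketch: quote strings
    (doubling embedded quotes) and pass numbers through unchanged."""
    if isinstance(value, str):
        return "'" + value.replace("'", "''") + "'"
    return str(value)

def per_row_statements(table, rows):
    """Option 1: one INSERT per Excel row -> many cursor.execute() calls."""
    return [
        "INSERT INTO {0} VALUES ({1})".format(
            table, ", ".join(sql_literal(v) for v in row))
        for row in rows
    ]

def one_batched_string(table, rows):
    """Option 2: every INSERT joined into one string -> a single
    cursor.execute() call; wrapped in an anonymous PL/SQL block,
    since a bare semicolon-separated batch is not a single statement."""
    body = "\n".join(
        "INSERT INTO {0} VALUES ({1});".format(
            table, ", ".join(sql_literal(v) for v in row))
        for row in rows
    )
    return "BEGIN\n" + body + "\nEND;"
```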
Thanks!