I'm using Python 3.6 and have written a script that takes every CSV file in a folder and creates a table for each one in a database.db file using sqlite3. Then, by running sqlite3 database.db,
I can use sqlite commands to query the tables. It works fine, but for large CSV files inserting the data into the tables is slow. SQLite's shell has its own command for importing a CSV file into a table, which looks like:
.mode csv
.import FILE_PATH TABLE_NAME
This seems to be much faster at creating the tables and inserting the data than what I have written. Is there any way to write Python code that feeds these commands to the sqlite command line, so it could be done for multiple CSV files automatically, without having to explicitly type the .import
command for each one? Or something to that effect?
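For context, one way this can be sketched: the .mode and .import dot-commands are features of the sqlite3 command-line shell, not of the SQL language, so Python's sqlite3 module cannot execute them directly, but a script of dot-commands can be piped into the shell via subprocess. The sketch below assumes the sqlite3 binary is on PATH and uses the (hypothetical) convention that each table is named after its CSV file's stem; build_import_script and run_imports are illustrative names, not a standard API.

```python
import pathlib
import subprocess

def build_import_script(csv_dir):
    """Build a dot-command script: one .import line per CSV file,
    with the table named after the file's stem (assumed convention)."""
    lines = [".mode csv"]
    for csv_file in sorted(pathlib.Path(csv_dir).glob("*.csv")):
        # Quote the path in case it contains spaces.
        lines.append(f'.import "{csv_file}" {csv_file.stem}')
    return "\n".join(lines) + "\n"

def run_imports(db_path, csv_dir):
    """Pipe the script into the sqlite3 shell on stdin
    (assumes the sqlite3 CLI is installed and on PATH)."""
    subprocess.run(
        ["sqlite3", str(db_path)],
        input=build_import_script(csv_dir),
        text=True,
        check=True,
    )
```

Calling run_imports("database.db", "my_csv_folder") would then import every CSV in the folder in one shell session, avoiding a manual .import per file.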