
I'm using Python 3.6 and have written a script that takes all csv files in a folder and creates a table for each one in a database.db file using sqlite3. Then, using sqlite3 database.db, I can run SQLite commands and query the tables. It works okay, but for large csv files it's a bit slow (inserting the data into the tables, that is). SQLite has its own command for importing a csv file into a table, which looks like:

.mode csv
.import FILE_PATH TABLE_NAME

This seems to be much faster at creating the tables and inserting the data than what I have written. Is there any way to write Python code that sends the above commands directly to the sqlite3 command line, so the import can be done for multiple csv files automatically without having to type the .import command for each one by hand? Or something to that effect?

  • You can use Python to send commands to the terminal using subprocess or os.system; see this [thread](https://stackoverflow.com/questions/3730964/python-script-execute-commands-in-terminal). This might work for you. – Alfred Rodenboog Jul 30 '21 at 15:16
  • Specifically this answer: https://stackoverflow.com/a/59671652/8451814 – Woodford Jul 30 '21 at 15:51
  • If it is a very large dataset, you also have the option to use bulk copy: https://pypi.org/project/bcpy/ – pyeR_biz Jul 31 '21 at 06:33
  • @Woodford yes perfect that's exactly what I was looking for, thank you :)) – bunirules Aug 02 '21 at 15:40
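
A minimal sketch of the subprocess approach suggested in the comments above, assuming the sqlite3 command-line tool is on the PATH; the folder name and the file-name-to-table-name rule here are illustrative, not taken from the original script:

import subprocess
from pathlib import Path

CSV_FOLDER = Path("csv_files")  # hypothetical folder holding the csv files
DB_FILE = "database.db"

for csv_path in sorted(CSV_FOLDER.glob("*.csv")):
    # Illustrative naming rule: table name = file name without the .csv extension.
    table_name = csv_path.stem
    # Build the same dot-commands you would type interactively and feed them
    # to the sqlite3 shell on its standard input.
    script = f".mode csv\n.import {csv_path} {table_name}\n"
    subprocess.run(
        ["sqlite3", DB_FILE],
        input=script,
        universal_newlines=True,  # text-mode stdin; on Python 3.7+ text=True also works
        check=True,               # raise CalledProcessError if sqlite3 reports an error
    )

If a file path contains spaces, it would likely need to be quoted inside the .import line. Each file also gets its own sqlite3 invocation here; the dot-commands for all files could just as well be concatenated into one script string and piped through a single call.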

0 Answers