
How can I make Pandas take a .sql file and turn it into a DataFrame, to then be exported to .csv?

I've searched online and found something like this:

import pandas as pd
import sqlite3

con = sqlite3.connect('path/to/file.sql')
data = pd.read_sql(
    'SELECT * FROM tablename', con
)
data.to_csv('path/to/file.csv')
con.close()

I've tried that (with changed file paths, of course), but it throws a DatabaseError saying "file is not a database". I am unsure what this means.

My .sql file looks something like this, though much longer:

CREATE TABLE tablename(
    one INTEGER,
    two TEXT PRIMARY KEY
);
INSERT INTO tablename(one, two) VALUES (1, 'hello world');
SELECT * FROM tablename -- I have tried with and without the SELECT statement

I can confirm that it is a valid SQL file.

Thank you in advance.

  • `sqlite3.connect` expects an SQLite database file; what you are giving it is a file of SQL(ite) commands. You can use your .sql file to populate an SQLite database and then read from it. I don't think pandas can read and execute SQL commands by itself. – Yaroslav Fyodorov Mar 08 '22 at 10:02
  • How do I get a db file? – silvncr Mar 08 '22 at 10:04
  • You can create it with the `sqlite3` command-line tool or any SQLite GUI client and then import your `.sql` file. I guess you can do it using the Python `sqlite3` module as well; see https://stackoverflow.com/questions/2049109/how-do-i-import-sql-files-into-sqlite-3 (a sketch along these lines follows below). – Yaroslav Fyodorov Mar 08 '22 at 10:08
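
Following the comments above, one approach is to execute the .sql script into an SQLite database first and then point pandas at that connection. This is only a minimal sketch, not a tested answer: the paths and the table name `tablename` are the placeholders from the question, and it assumes the script contains only statements that `sqlite3` can execute.

import sqlite3
import pandas as pd

# Read the SQL script (the CREATE TABLE / INSERT statements) from disk.
# 'path/to/file.sql' is the placeholder path from the question.
with open('path/to/file.sql', 'r', encoding='utf-8') as f:
    script = f.read()

# Build a throwaway in-memory database and run the script against it.
con = sqlite3.connect(':memory:')
con.executescript(script)

# Now the usual pandas round trip works, because con points at a real database.
data = pd.read_sql('SELECT * FROM tablename', con)
data.to_csv('path/to/file.csv', index=False)
con.close()

Passing a filename such as 'path/to/file.db' to `sqlite3.connect` instead of ':memory:' would keep the generated database on disk for later queries.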

0 Answers