215

How do I dump the data, and only the data, not the schema, of some SQLite3 tables of a database (not all the tables)? The dump should be in SQL format, as it should be easily re-entered into the database later and should be done from the command line. Something like

sqlite3 db .dump

but without dumping the schema and selecting which tables to dump.

galath
Pablo Fernandez

13 Answers

236

You're not saying what you wish to do with the dumped file.

To get a CSV file (which can be imported into almost everything)

.mode csv
-- use '.separator SOME_STRING' for something other than a comma.
.headers on
.output file.csv
select * from MyTable;

To get an SQL file (which can be reinserted into a different SQLite database)

.mode insert <target_table_name>
.output file.sql
select * from MyTable;
rinogo
CyberFonic
  • Is there a way to do this programmatically using SQL statements? I can see how to do it using the interpreter, but what if I wanted to write a script? – coleifer Oct 10 '14 at 00:45
  • You can put your statements in a file (e.g. sample.txt) and then invoke it using: sqlite3 db.sq3 < sample.txt – CyberFonic Oct 13 '14 at 23:48
  • "*Or use the .once command instead of .output and output will only be redirected for the single next command before reverting to the console. Use .output with no arguments to begin writing to standard output again.*" [SQLite docs](https://sqlite.org/cli.html) – ruffin Mar 14 '18 at 18:56
193

You can do this by taking the difference of the .schema and .dump outputs, for example with grep:

sqlite3 some.db .schema > schema.sql
sqlite3 some.db .dump > dump.sql
grep -vx -f schema.sql dump.sql > data.sql

The data.sql file will then contain only the data, without the schema, something like this:

BEGIN TRANSACTION;
INSERT INTO "table1" VALUES ...;
...
INSERT INTO "table2" VALUES ...;
...
COMMIT;
starball
jellyfish
  • @anurageldorado it's plain sql. just run `sqlite3 some.db < data.sql` – jellyfish Feb 13 '16 at 08:07
  • For some reason this did not work for me; I needed a workaround. `sqlite3 storage/db/jobs.s3db .schema jobs > schema.sql` did not work, but `echo '.schema jobs' | sqlite3 storage/db/jobs.s3db > schema.sql` worked fine – abkrim Apr 27 '16 at 14:43
  • It seemed like a good solution, but in my case most of the lines were actually getting removed by grep. The .schema command generates the schema of each table on multiple lines, so there is a line containing only `);`, and the grep removes all of the lines containing `);`. Adding the `-x` option to grep resolves this problem. – Sunder Jul 13 '16 at 07:49
  • Thank you very much!! – emont01 Jun 20 '23 at 16:03
44

You can specify one or more table arguments to the .dump special command, e.g. `sqlite3 db ".dump 'table1' 'table2'"`.

Juozas Kontvainis
Paul Egan
  • when I add multiple table names as you mentioned, it gives me this output: Usage: .dump ?--preserve-rowids? ?LIKE-PATTERN? – mwm Nov 28 '17 at 22:48
  • @mwm I'm observing the same problem in `sqlite3 3.31.1 (2020/01/27)`. The [changelog](https://sqlite.org/changes.html) says nothing about that. (By the way, `--preserve-rowids` does work but is not documented at all.) – ynn Mar 21 '20 at 16:29
38

Not the best way, but at least it does not need external tools (except grep, which is standard on *nix boxes anyway):

sqlite3 database.db3 .dump | grep '^INSERT INTO "tablename"'

but you need to run this command once for each table you are looking for.

Note that this does not include the schema.

spacediver
polyglot
  • I used `sqlite3 Database.s3db .dump` – Jader Dias Feb 14 '11 at 18:23
  • This will break if those inserts have newlines in the values. Better use `grep -v '^CREATE'` as suggested in one of the other answers – dequis Sep 22 '14 at 11:37
  • using `grep -v '^CREATE'` will break if the `CREATE` statements have line breaks in them (which they sometimes do). Best, IMO, is not to automatically strip out the `CREATE` statements at all, but to manually edit them out. Just use whatever text editor you need, search for `CREATE`, and manually remove those statements. As long as the database isn't huge (and since you're using sqlite, I'd guess it's not), this is pretty simple. – Dan Jones Oct 08 '15 at 19:51
  • but the grep of the create will also take the create from the views. how can i remove that? – Silve2611 Feb 01 '16 at 12:09
15

Any answer which suggests using grep to exclude the CREATE lines or just grab the INSERT lines from the sqlite3 $DB .dump output will fail badly. The CREATE TABLE commands list one column per line (so excluding CREATE won't get all of it), and values on the INSERT lines can have embedded newlines (so you can't grab just the INSERT lines).

for t in $(sqlite3 $DB .tables); do
    echo -e ".mode insert $t\nselect * from $t;"
done | sqlite3 $DB > backup.sql

Tested on sqlite3 version 3.6.20.

If you want to exclude certain tables you can filter them with `$(sqlite3 $DB .tables | grep -v -e one -e two -e three)`, or if you want to get a specific subset, replace that with `one two three`.

retracile
10

As an improvement to Paul Egan's answer, this can be accomplished as follows:

sqlite3 database.db3 '.dump "table1" "table2"' | grep '^INSERT'

--or--

sqlite3 database.db3 '.dump "table1" "table2"' | grep -v '^CREATE'

The caveat, of course, is that you have to have grep installed.

Drew
  • I like this one. As an added bonus, it still works if you have a dumped SQL file hanging around: just `cat database.sql | grep '^INSERT' > database_inserts.sql` (same for schema, replace with `grep '^CREATE'`) – trisweb Jun 04 '12 at 17:52
  • @trisweb, of course you mean `grep '^INSERT' < database.sql > database_inserts.sql`; that `cat` is superfluous – Sebastian Jul 17 '12 at 14:11
  • Nothing superfluous about it. The `cat` costs basically nothing to execute and makes the chain of input to output much clearer. Of course, you could also write `< database.sql grep '^INSERT' ...` but an explicit pipe is much easier to read. – rjh Feb 15 '16 at 14:49
  • when I add multiple table names as you mentioned, it gives me this output: Usage: .dump ?--preserve-rowids? ?LIKE-PATTERN? – mwm Nov 28 '17 at 22:48
  • -1: Searching for lines with CREATE is a useless idea. Almost every view or trigger especially, if it contains comments, requires more than one line. – ceving Dec 11 '17 at 14:26
7

In Python, Java, or any other high-level language the .dump command is not available, so we need to code the conversion by hand. Here is a Python example; examples in other languages would be appreciated:

import sqlite3
import csv
from os import path

def convert_to_csv(directory, db_name):
    conn = sqlite3.connect(path.join(directory, db_name + '.db'))
    cursor = conn.cursor()
    # List all user tables in the database
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table';")
    tables = cursor.fetchall()
    for table in tables:
        table = table[0]
        cursor.execute('SELECT * FROM ' + table)
        column_names = [column_name[0] for column_name in cursor.description]
        with open(path.join(directory, table + '.csv'), 'w', newline='') as csv_file:
            csv_writer = csv.writer(csv_file)
            csv_writer.writerow(column_names)
            for row in cursor:
                csv_writer.writerow(row)
    conn.close()

If you have panel data, in other words many individual entries with ids, add this to the loop and it will also dump summary statistics:

        if 'id' in column_names:
            with open(path.join(directory, table + '_aggregate.csv'), 'w', newline='') as csv_file:
                csv_writer = csv.writer(csv_file)
                column_names.remove('id')
                column_names.remove('round')
                # Sum every remaining column, grouped by round
                sum_string = ','.join('sum(%s)' % item for item in column_names)
                cursor.execute('SELECT round, ' + sum_string + ' FROM ' + table + ' GROUP BY round;')
                csv_writer.writerow(['round'] + column_names)
                for row in cursor:
                    csv_writer.writerow(row)
Davoud Taghawi-Nejad
7

Review of other possible solutions

Include only INSERTs

sqlite3 database.db3 .dump | grep '^INSERT INTO "tablename"'

Easy to implement, but it will fail if any of your columns include newlines.

SQLite insert mode

for t in $(sqlite3 $DB .tables); do
    echo -e ".mode insert $t\nselect * from $t;"
done | sqlite3 $DB > backup.sql

This is a nice and customizable solution, but it doesn't work if your columns have blob objects like the 'Geometry' type in spatialite.

Diff the dump with the schema

sqlite3 some.db .schema > schema.sql
sqlite3 some.db .dump > dump.sql
grep -v -f schema.sql dump.sql > data.sql

Not sure why, but it is not working for me.

Another (new) possible solution

Probably there is no best answer to this question, but one that works for me is to grep the inserts, taking into account that there can be newlines in the column values, with an expression like this:

grep -Pzo "(?s)^INSERT.*\);[ \t]*$"

To select the tables to be dumped, .dump accepts a LIKE argument to match the table names, but if this is not enough a simple script is probably the better option:

TABLES='table1 table2 table3'

echo '' > /tmp/backup.sql
for t in $TABLES ; do
    echo -e ".dump ${t}" | sqlite3 database.db3 | grep -Pzo "(?s)^INSERT.*?\);$" >> /tmp/backup.sql
done

or, something more elaborate that respects foreign keys and wraps the whole dump in a single transaction:

TABLES='table1 table2 table3'

echo 'BEGIN TRANSACTION;' > /tmp/backup.sql
echo '' >> /tmp/backup.sql
for t in $TABLES ; do
    echo -e ".dump ${t}" | sqlite3 $1 | grep -Pzo "(?s)^INSERT.*?\);$" | grep -v -e 'PRAGMA foreign_keys=OFF;' -e 'BEGIN TRANSACTION;' -e 'COMMIT;' >> /tmp/backup.sql
done

echo '' >> /tmp/backup.sql
echo 'COMMIT;' >> /tmp/backup.sql

Take into account that the grep expression will fail if `);` appears as a string in any of the columns.

To restore it (in a database with the tables already created):

sqlite3 -bail database.db3 < /tmp/backup.sql
Francisco Puga
  • The regexp of the new solution should be ` (?s)(?<=\n|^)INSERT.*?\);[ \t]*(\n|$)`: using `*?` instead of `*` we match until we find the first `);`, otherwise we would match until the last. Then start and end should be `\n|^` and `\n|$` for `-z`. – DPD- Jul 04 '23 at 20:19
5

According to the SQLite documentation for the Command Line Shell For SQLite, you can export an SQLite table (or part of a table) as CSV simply by setting the "mode" to "csv" and then running a query to extract the desired rows of the table:

sqlite> .header on
sqlite> .mode csv
sqlite> .once c:/work/dataout.csv
sqlite> SELECT * FROM tab1;
sqlite> .exit

Then use the ".import" command to import CSV (comma separated value) data into an SQLite table:

sqlite> .mode csv
sqlite> .import C:/work/dataout.csv tab1
sqlite> .exit

Please read the further documentation about the two cases to consider: (1) Table "tab1" does not previously exist and (2) table "tab1" does already exist.

PeterCo
3

The best method would be to do what the sqlite3 db dump code itself does, excluding the schema parts.

Example pseudo code:

SELECT 'INSERT INTO ' || tableName || ' VALUES( ' || 
  {for each value} ' quote(' || value || ')'     (+ commas until final)
|| ')' FROM 'tableName' ORDER BY rowid DESC

See src/shell.c:838 (for sqlite-3.5.9) for the actual code.

You might even just take that shell code, comment out the schema parts, and use that.
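To make the pseudocode concrete: SQLite exposes the same quote() function to SQL, so the INSERT statements can be built entirely inside a query. A small self-contained Python sketch (the table `mytable` and its columns `a` and `b` are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE mytable (a, b)')
conn.execute("INSERT INTO mytable VALUES (1, 'it''s')")

# quote() renders each value as a valid SQL literal (strings get their
# single quotes doubled), mirroring what shell.c does internally:
stmts = conn.execute(
    "SELECT 'INSERT INTO mytable VALUES(' || quote(a) || ',' || quote(b) || ');'"
    " FROM mytable"
).fetchall()
print(stmts[0][0])  # INSERT INTO mytable VALUES(1,'it''s');
```

This is essentially the same formatting that the shell's `.mode insert` produces for you.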

harningt
2

This version works well with newlines inside inserts:

sqlite3 database.sqlite3 .dump | grep -v '^CREATE'

In practice it excludes all the lines starting with CREATE, which are less likely to contain newlines.

Elia Schito
0

The answer by retracile should be the closest one, yet it does not work for my case. One insert query just broke in the middle and the export just stopped. Not sure what the reason is. However, it works fine during .dump.

Finally I wrote a tool to split up the SQL generated from .dump:

https://github.com/motherapp/sqlite_sql_parser/

Walty Yeung
-3

You could do a select on the tables, inserting commas after each field to produce a CSV, or use a GUI tool to return all the data and save it to a CSV.