
I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use Insert statements, I should wrap them in a transaction. There's also a post about using your own database, but I need this data to go into my app's standard Android database. Note that this would only be done once per device.

Some ideas:

  1. Put a bunch of SQL statements in a file, read them in a line at a time, and exec the SQL.

  2. Put the data in a CSV file, or JSON, or YAML, or XML, or whatever. Read a line at a time and do db.insert().

  3. Figure out how to do an import and do a single import of the entire file.

  4. Make a sqlite database containing all the records, copy that onto the Android device, and somehow merge the two databases.

  5. [EDIT] Put all the SQL statements in a single file in res/values as one big string. Then read them a line at a time and exec the SQL.

What's the best way? Are there other ways to load data? Are 3 and 4 even possible?

Ron Romero

5 Answers


Normally, each time db.insert() is used, SQLite creates a transaction (and resulting journal file in the filesystem), which slows things down.

If you use db.beginTransaction() and db.endTransaction() SQLite creates only a single journal file on the filesystem and then commits all the inserts at the same time, dramatically speeding things up.

Here is some pseudo code from: Batch insert to SQLite database on Android

try
{
  db.beginTransaction();

  for (Record record : list)
  {
    doSomeProcessing(record);

    if (isValidEntry(record))
    {
      db.insert(SOME_TABLE, null, SOME_VALUE);
    }

    someOtherProcessing(record);
  }

  db.setTransactionSuccessful();
}
catch (SQLException e)
{
  // Log the error; since setTransactionSuccessful() was not reached,
  // endTransaction() below rolls everything back.
}
finally
{
  db.endTransaction();
}

If you wish to abort a transaction due to an unexpected error, simply call db.endTransaction() without first marking the transaction as successful via db.setTransactionSuccessful(); everything done inside the transaction is then rolled back.

Another useful method is db.inTransaction() (returns true or false), which tells you whether you are currently in the middle of a transaction.

Documentation: see the SQLiteDatabase class reference.

Jake Wilson
  • For more code reference: [Batch insert to SQLite database on Android](http://notes.theorbis.net/2010/02/batch-insert-to-sqlite-on-android.html) – Paresh Mayani May 31 '12 at 10:54
  • This is exactly what I was looking for. I was importing about 50 CSV files with thousands of records, and it would take several hours to complete when done by reading a line from the CSV and inserting it into the database. With db.beginTransaction() it only took a few seconds! Thanks! – Luis Oct 04 '13 at 14:32
  • @AndrewMedico Yes, per the linked example and the documentation, `db.setTransactionSuccessful()` must be called. `db.endTransaction()` is already called in the finally block. – JasonMArcher Apr 16 '14 at 16:48
  • Database got locked – Karue Benson Karue Jul 27 '17 at 02:52

I've found that for bulk insertions, the (apparently little-used) DatabaseUtils.InsertHelper class is several times faster than using SQLiteDatabase.insert.

Two other optimizations also helped with my app's performance, though they may not be appropriate in all cases:

  • Don't bind values that are empty or null.
  • If you can be certain that it's safe to do it, temporarily turning off the database's internal locking can also help performance.

I have a blog post with more details.

Dan Breslau
  • Thanks! This optimization made importing a bunch of records 2x faster. – Steve Pomeroy Oct 29 '12 at 04:22
  • I read the blog and the comments. Both provided enough detail to insert all my data within a couple of seconds. Many times better than several minutes. – Knossos Nov 08 '12 at 10:36
  • Now that `InsertHelper` is deprecated, are we forced to use `SQLiteStatement`? – Marek Sebera Jan 15 '13 at 12:26
  • Until now, I hadn't realized that `InsertHelper` is deprecated -- but I'm not surprised. `SQLiteStatement` may be as good a solution as any. I'll try to look into this some more in my Copious Spare Time™ :-) – Dan Breslau Jan 18 '13 at 21:15
  • For the record: this class was deprecated in API level 17. Use `SQLiteStatement` instead. – Martin Marconcini Aug 06 '13 at 19:24
  • [Bulk insert with `SQLiteStatement`](http://stackoverflow.com/a/14344928/1276636) – Sufian Jan 11 '15 at 03:39

The example below works well:

String sql = "INSERT INTO " + DatabaseHelper.TABLE_PRODUCT_LIST
        + " VALUES (?,?,?,?,?,?,?,?,?);";

SQLiteDatabase db = this.getWritableDatabase();
SQLiteStatement statement = db.compileStatement(sql);
db.beginTransaction();
try {
    for (int idx = 0; idx < Produc_List.size(); idx++) {
        statement.clearBindings();
        statement.bindLong(1, Produc_List.get(idx).getProduct_id());
        statement.bindLong(2, Produc_List.get(idx).getCategory_id());
        statement.bindString(3, Produc_List.get(idx).getName());
        // Parameters left unbound (e.g. 4 and 6 here) are inserted as NULL.
        statement.bindString(5, Produc_List.get(idx).getPrice());
        statement.bindString(7, Produc_List.get(idx).getImage());
        statement.bindLong(8, Produc_List.get(idx).getLanguage_id());
        statement.bindLong(9, Produc_List.get(idx).getPl_rank());
        statement.execute();
    }
    db.setTransactionSuccessful();
} finally {
    db.endTransaction();
}
arun-r

My solution for this is a bit unusual, but it works fine: I compile all of the data into one large statement and insert it in a single go (a bulk insert of sorts).

I use db.execSQL(query), and I build the query with the following form...

INSERT INTO yourtable SELECT * FROM (
    SELECT 'data1','data2'.... UNION ALL
    SELECT 'data1','data2'.... UNION ALL
    SELECT 'data1','data2'.... UNION ALL
    .
    .
    .
    SELECT 'data1','data2'....
)

The only problem is building the query, which can get messy. Two caveats: prefer UNION ALL over plain UNION (UNION silently drops duplicate rows and is slower), and SQLite limits the number of terms in a compound SELECT (500 by default), so very large batches must be split into chunks. I hope it helps.
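The string assembly itself can be sketched in plain Java. The class and method names below are illustrative assumptions, not from the answer; on Android, the resulting string would be passed to db.execSQL(). In real code, bound parameters are safer than concatenating literals.

```java
import java.util.Arrays;
import java.util.List;

public class BulkInsertQueryBuilder {
    // Builds an INSERT ... SELECT ... UNION ALL ... statement from rows of
    // literal string values, escaping single quotes along the way.
    static String buildBulkInsert(String table, List<String[]> rows) {
        StringBuilder sb = new StringBuilder("INSERT INTO ")
                .append(table).append(" SELECT * FROM (");
        for (int i = 0; i < rows.size(); i++) {
            if (i > 0) sb.append(" UNION ALL");
            sb.append(" SELECT ");
            String[] row = rows.get(i);
            for (int j = 0; j < row.length; j++) {
                if (j > 0) sb.append(",");
                // Double up single quotes to escape them in SQL literals.
                sb.append("'").append(row[j].replace("'", "''")).append("'");
            }
        }
        return sb.append(")").toString();
    }

    public static void main(String[] args) {
        List<String[]> rows = Arrays.asList(
                new String[]{"a1", "a2"},
                new String[]{"b1", "b2"});
        System.out.println(buildBulkInsert("yourtable", rows));
        // INSERT INTO yourtable SELECT * FROM ( SELECT 'a1','a2' UNION ALL SELECT 'b1','b2')
    }
}
```

For batches larger than the compound SELECT limit, the row list would be split into chunks and the builder called once per chunk.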

Andrew Prat
8

I don't believe there is any feasible way to accomplish #3 or #4 on your list.

Of the remaining options, two have the data file contain raw SQL statements directly, and the other stores the data in a non-SQL format.

All three would work just fine, but the latter suggestion of grabbing the data from a formatted file and building the SQL yourself seems the cleanest. If true batch update capability is added at a later date your datafile is still usable, or at least easily processable into a usable form. Also, creation of the datafile is more straightforward and less error prone. Finally, having the "raw" data would allow import into other data-store formats.
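As a sketch of that formatted-file approach (the class name, table name, and CSV layout here are assumptions, not from the question): parse each line into column values, then hand them to db.insert(), which is shown as a comment since it requires a real Android SQLiteDatabase.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class CsvRowParser {
    // Splits one CSV line into trimmed column values.
    // Naive: does not handle quoted fields that contain commas.
    static List<String> parseLine(String line) {
        return Arrays.stream(line.split(",", -1)) // -1 keeps trailing empty columns
                .map(String::trim)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> cols = parseLine("42, Widget, 9.99");
        // On Android, build a ContentValues from cols and call:
        //   db.insert("products", null, values);
        System.out.println(cols); // [42, Widget, 9.99]
    }
}
```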

In any case, you should (as you mentioned) wrap the groups of inserts into transactions to avoid the per-row transaction journal creation.
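For the SQL-file variants (options 1 and 5), the per-statement loop is mostly string handling; below is a minimal sketch under the assumption that statements are separated by semicolons, with the Android-specific db.execSQL() call left as a comment since it needs a real SQLiteDatabase.

```java
import java.util.ArrayList;
import java.util.List;

public class SqlScriptSplitter {
    // Splits a bundled SQL script into individual statements.
    // Naive split on ';' -- fine for simple INSERTs that contain
    // no semicolons inside string literals.
    static List<String> splitStatements(String script) {
        List<String> statements = new ArrayList<>();
        for (String part : script.split(";")) {
            String sql = part.trim();
            if (!sql.isEmpty()) {
                statements.add(sql);
            }
        }
        return statements;
    }

    public static void main(String[] args) {
        String script = "INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);";
        for (String sql : splitStatements(script)) {
            // On Android, inside one beginTransaction()/endTransaction() pair:
            //   db.execSQL(sql);
            System.out.println(sql);
        }
    }
}
```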

codelark