
I have an SQLite database that I'm inserting data into using Ruby's Sequel gem.

When attempting to insert 1M+ items it seems to be quite slow.

The code I'm using to do this is below.

    DB[:my_table].multi_insert to_insert

Are there any faster ways to do this? Am I missing something?

Nathan McCallum

1 Answer


I found that I got better performance wrapping the code in a transaction.

DB.transaction do
    DB[:my_table].multi_insert to_insert
end

I'm not convinced this is as fast as it can get, but it now runs in just over 2 minutes, which should be OK. Oh well, case closed.

Nathan McCallum
    You should definitely try `PRAGMA synchronous=off` and `PRAGMA journal_mode = MEMORY` in addition to that. Also, if your table has an index, consider removing it and recreating it *after* the bulk insert. You can read more about SQLite tuning in [the FAQ](http://web.utk.edu/~jplyon/sqlite/SQLite_optimization_FAQ.html) – Niklas B. May 11 '14 at 12:07