
OK, so I have a Rails application called HashtagMapper.com which finds tweets by the thousands and maps them. The problem I am having is that geocoding 1000+ queries can take up to 10 seconds, and then each .create() call takes about 10ms, which for 1000 records adds another 10 seconds.

How can I create 1000+ DB objects in Rails all at once? Right now I am calling

SearchResult.create(search_result_attrs) # an array of attribute hashes, one per SearchResult

For the 1000+ objects. Would plain SQL be faster? How?

  • [This question and its responses](http://stackoverflow.com/questions/15317837/bulk-insert-records-into-active-record-table) may offer some insights. – eebbesen Nov 30 '13 at 18:31
  • That's interesting, but I would prefer not to convert to CSV to import records. – OneChillDude Nov 30 '13 at 19:31
  • Of course you wouldn't want to convert your objects to csv format -- `CSV` is only the collection of objects over which the enumeration is occurring. The two answers offer different ways to approach your question. Maybe neither is what you're looking for, but neither forces you to convert anything to csv. – eebbesen Nov 30 '13 at 20:59
  • Of course. I missed that on the first look-through. Not enough coffee. – OneChillDude Dec 02 '13 at 17:11

1 Answer


Use the bulk_insert gem, which batches many rows into a single INSERT statement.
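Assuming the usual installation, add the gem to your Gemfile and run bundle install:

# Gemfile
gem "bulk_insert"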

From arrays or hashes:

Book.bulk_insert(:title, :author) do |worker|
  # specify a row as an array of values...
  worker.add ["Eye of the World", "Robert Jordan"]

  # or as a hash
  worker.add title: "Lord of Light", author: "Roger Zelazny"
end

If you don't pass any column names, it defaults to the model's database columns:

class Book < ActiveRecord::Base
end

book_attrs = ... # some array of hashes, for instance
Book.bulk_insert do |worker|
  book_attrs.each do |attrs|
    worker.add(attrs)
  end
end
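Applied to the question's model, a sketch might look like this (the columns :text, :latitude, :longitude and the geocoded_tweets variable are assumptions for illustration, not from the original post):

# Hypothetical columns; substitute whatever SearchResult actually has.
SearchResult.bulk_insert(:text, :latitude, :longitude) do |worker|
  geocoded_tweets.each do |tweet|
    # each worker.add queues a row; the gem flushes them in batched INSERTs
    worker.add(text: tweet[:text], latitude: tweet[:lat], longitude: tweet[:lng])
  end
end

Per the gem's README, queued rows are flushed in multi-row INSERT statements (500 rows per statement by default, adjustable with the set_size option), which is why this is much faster than 1000 individual create calls.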
toobulkeh