
I am in a situation where I need to insert multiple records into a PostgreSQL database through an AJAX call, based on a foreign key.

Currently I am using db1.db2_set.create(...) for each record, looping over a list of dictionaries.

Is this the best way to do it? It seems like I'm hitting the database for every insert.
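
For reference, a minimal sketch of the pattern described above, using hypothetical `Author`/`Book` models in place of `db1`/`db2` (each `create()` call saves the new row immediately, so it issues one INSERT per record):

    from myapp.models import Author  # hypothetical app and model

    author = Author.objects.get(pk=1)          # plays the role of db1
    rows = [{"title": "A"}, {"title": "B"}]    # list of dicts from the AJAX payload

    for row in rows:
        author.book_set.create(**row)          # one INSERT per iteration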

bash-
  • http://stackoverflow.com/questions/4294088/accelerate-bulk-insert-using-djangos-orm – Timmy O'Mahony Oct 08 '11 at 12:55
  • You don't hit the DB until you call `.save`. Are you doing that within the loop, or can you wait until the loop completes? If you can wait, Django will update the DB in a single transaction. – ire_and_curses Oct 08 '11 at 13:00
  • Actually, `create` does hit the database; I don't need to call `.save()` for it to commit to the DB (see the sketch below). – bash- Oct 09 '11 at 16:36
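
To make the distinction in the comments above concrete, a minimal sketch with the same hypothetical models:

    from myapp.models import Author, Book  # hypothetical app and models

    author = Author.objects.get(pk=1)

    # Constructing an instance in memory does not touch the database;
    # the INSERT only runs when .save() is called:
    book = Book(author=author, title="A")
    book.save()

    # create() constructs and saves in one step, so it hits the database immediately:
    author.book_set.create(title="B")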

1 Answer


I'm pretty sure that Django will query the database when you call the save() method. So if you do something like:

for i in objects:
    db1.db2_set.create(**i)

db1.save()

It may access the database only once. But still, this might be useful: http://djangosnippets.org/snippets/766/

It's a middleware that you can add to see how many queries your Django application runs on every page you access.

Guilherme David da Costa
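
The question linked in the comments points toward batching the inserts themselves rather than saving one at a time. A minimal sketch of that approach, assuming a Django version that provides `QuerySet.bulk_create` (1.4 or later, which postdates this exchange) and the same hypothetical `Author`/`Book` models:

    from myapp.models import Author, Book  # hypothetical app and models

    author = Author.objects.get(pk=1)
    rows = [{"title": "A"}, {"title": "B"}]  # list of dicts from the AJAX payload

    # Build unsaved instances in memory, then insert them all in a single query.
    books = [Book(author=author, **row) for row in rows]
    Book.objects.bulk_create(books)

Note that `bulk_create` bypasses the model's `save()` method and signals. On Django versions before 1.4, wrapping the loop in a single transaction (e.g. `django.db.transaction.commit_on_success`) cuts per-insert overhead, but it still issues one INSERT per record.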