
I'm building a Django-based geolocation service that determines a user's location from their IP address. The first thing I need to do is insert the IP data into my database.

I used the following code (simplified) to insert records into my DB:

    for ipLoc in ipSeeker.ipLocationList:
        placeName = ipLoc.country + ipLoc.area
        # create() already saves the object; the extra .save() I had here
        # was issuing a redundant second query per row
        IPLog.objects.create(
            startIP=int_to_dqn(ipLoc.startIP),
            endIP=int_to_dqn(ipLoc.endIP),
            place=placeName,
        )
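For reference, each `create()` call above runs as its own statement (and, under autocommit, its own transaction). A batched sketch of the same loop, assuming Django 1.4+ which added `bulk_create`; `IPLog`, `ipSeeker`, and `int_to_dqn` are the names from the snippet above, and `chunked()` is a hypothetical helper, not a Django API:

```python
def chunked(items, size):
    """Yield successive lists of at most `size` items from an iterable."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch


def bulk_insert(ip_locations, batch_size=1000):
    """Insert records in batches of `batch_size` instead of one at a time.

    Assumes the IPLog model and int_to_dqn helper from the question;
    bulk_create issues one multi-row INSERT per batch.
    """
    for batch in chunked(ip_locations, batch_size):
        IPLog.objects.bulk_create([
            IPLog(startIP=int_to_dqn(loc.startIP),
                  endIP=int_to_dqn(loc.endIP),
                  place=loc.country + loc.area)
            for loc in batch
        ])
```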

The ipLocationList has approximately 400k IP records, and my script inserts only 20k records in 20 minutes. That is far too slow to be acceptable.

So my question is: where is the bottleneck and how could I make it faster?

Thanks in advance!

xiao 啸
  • See my answer on this page: http://stackoverflow.com/questions/7019831/bulk-batch-update-upsert-in-postgresql/7020219#7020219. – atrain Aug 18 '11 at 13:07

2 Answers


Use raw SQL and transactions on the Django side (https://docs.djangoproject.com/en/dev/topics/db/sql/) along with the COPY command from Postgres!

See also: What's the fastest way to do a bulk insert into Postgres?
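A minimal sketch of the COPY approach through psycopg2 (Django's Postgres driver). The table name `iplog` and its columns are assumptions standing in for the questioner's IPLog model; `bulk_load` expects an open psycopg2 connection:

```python
import io


def rows_to_copy_buffer(rows):
    """Serialize (start_ip, end_ip, place) tuples into the tab-separated
    text format that PostgreSQL's COPY FROM expects."""
    buf = io.StringIO()
    for start_ip, end_ip, place in rows:
        buf.write("%s\t%s\t%s\n" % (start_ip, end_ip, place))
    buf.seek(0)
    return buf


def bulk_load(connection, rows):
    """Stream all rows into the table with a single COPY statement,
    in one transaction, instead of one INSERT per record.

    Assumes a table named "iplog" with columns start_ip, end_ip, place.
    """
    buf = rows_to_copy_buffer(rows)
    with connection.cursor() as cur:
        cur.copy_from(buf, "iplog", columns=("start_ip", "end_ip", "place"))
    connection.commit()
```

COPY skips per-statement parsing and planning entirely, which is why it tends to be the fastest option for loads of this size.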

StefanNch

I found a complete tutorial on siafoo about how to load a GeoIP database into PostgreSQL. It is well written and covers most of the details: http://www.siafoo.net/article/53

xiao 啸