
I'm currently working with SQLite 3, and I need to handle nearly 100M inserts. However, the inserts become extremely slow later on, down to roughly one every 30 seconds.

Is there any other database module usable from Python that handles inserts quickly even with large data?

I'm very frustrated by this speed issue. Any help would be greatly appreciated.

ytrewq
    Have you read [this](http://stackoverflow.com/questions/1711631/how-do-i-improve-the-performance-of-sqlite) yet? Another good overview [here](http://codereview.stackexchange.com/questions/26822/myth-busting-sqlite3-performance-w-pysqlite). – John Jun 24 '13 at 23:45
    Bit vague.... What kind of data - how is to be accessed later - what kind of queries are going to happen? Is SQL required - is a NoSQL or KeyValue store okay... Anyway, ideally you should give an idea of the data and how you're currently doing your insert into sqlite3 – Jon Clements Jun 24 '13 at 23:45
  • @JonClements I've tried to do it internally in Python using list with no sql. Because my data is large, later it becomes a memory issue.. – ytrewq Jun 24 '13 at 23:48
  • @CosmicRabbitMediaInc and how do you expect to be able to retrieve the data later? – Jon Clements Jun 24 '13 at 23:50
  • In the mean time - where's your existing code - for instance, are you using multiple `insert` queries, or are you making use of `executemany` or similar? – Jon Clements Jun 24 '13 at 23:53

1 Answer


For this particular use case, SQLite may not be the best database. I suggest the following tools (from higher to lower level):
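That said, before switching databases it's worth ruling out the usual `sqlite3` pitfalls mentioned in the comments and the linked performance threads: autocommitting each `INSERT` individually, and leaving the default durability settings on. A minimal sketch of a batched insert, with a placeholder file name, table, and data generator (none of these come from the question):

```python
import sqlite3

# "data.db", the items table, and the generated rows are all placeholders.
conn = sqlite3.connect("data.db")
conn.execute("CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, value TEXT)")

# Trade durability for speed: fewer fsyncs, journal kept in RAM.
# A crash or power loss mid-run can corrupt the file, so only do this
# for rebuildable bulk loads.
conn.execute("PRAGMA synchronous = OFF")
conn.execute("PRAGMA journal_mode = MEMORY")

rows = (("row-%d" % i,) for i in range(100_000))  # stand-in for the real data

# One transaction around the whole batch instead of one per INSERT,
# and executemany instead of a Python-level loop of execute calls.
with conn:
    conn.executemany("INSERT INTO items (value) VALUES (?)", rows)

print(conn.execute("SELECT COUNT(*) FROM items").fetchone()[0])  # prints 100000
conn.close()
```

For 100M rows you would feed `executemany` chunks of the data (say, one commit per few hundred thousand rows) rather than a single giant transaction, but the principle is the same: the per-transaction fsync, not the insert itself, is usually what makes naive `sqlite3` loops crawl.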

Paulo Scardine