
I am trying to insert around 500 records into an SQLite database, and it is taking 20 seconds, which is way too long. Could anyone please tell me how to optimize this?

NSArray *requetes = [[dic objectForKey:@"response"] objectForKey:@"requetes"];
for (int i = 0; i < [requetes count]; i++) {
    [sqlManager executeQuery:[requetes objectAtIndex:i]];
}

with

-(void)executeQuery:(NSString *)_query
{
    sqlite3 *database;
    if (sqlite3_open([databasePath UTF8String], &database) == SQLITE_OK) {
        if (sqlite3_exec(database, [_query UTF8String], NULL, NULL, NULL) != SQLITE_OK) {
            NSLog(@"Query failed: %@", _query);
        }
        sqlite3_close(database);
    }
}
Nabil El
    Open only once. And then start a transaction if you're doing multiple inserts. – Hot Licks Jan 21 '14 at 12:28
  • Just a suggestion: use FMDB https://github.com/ccgus/fmdb — then you can use transactions without worrying about rollback. – amar Jan 21 '14 at 12:31
  • 2
    Note that **there is no need to open/close on every operation**. SQLite is quite robust in terms of preserving data if the app crashes with the DB open. Once you've done "commit" (which occurs implicitly after each op unless you start a transaction), the data is guaranteed to be written to disk. – Hot Licks Jan 21 '14 at 12:39

3 Answers


You are opening and closing the database for every query (insertion), so that takes extra time as well.

Open the database once, insert all the records in a single pass, and close it at the end.

Kumar KL
  1. Open once and leave the database open until the app is closed or you're done with a DB-intense phase. SQLite can easily handle the situation if your app "crashes" with the DB open.
  2. When doing multiple inserts close together, where it's fine if either all of them go or none of them go, start a transaction (and commit it when done). When you do not explicitly start a transaction, SQLite does an implicit start/commit around each insert or update, "forcing" the data to "disk" each time. This takes A LOT of time. Doing one start/commit around a bunch of inserts will improve insert performance at least 10-fold.
Hot Licks
  • I used your solution and it works for me, but I also do one more thing in the background to update some records coming from the server, and it crashes. Can you please give me a solution? – Ilesh P Oct 13 '17 at 10:56
  • @ilesh - If you're doing stuff "in the background" you may not be serializing properly. – Hot Licks Oct 13 '17 at 12:13

I think you should keep your database open.

What takes the most time is opening and closing the database.

For example, create a method dedicated to this insertion that opens the database before the first element and closes it after the last.

Miguel Chaves