
I'm experiencing very bad performance and crashes with Linq2indexedDB when bulk inserting this way:

for (var i = 0; i < clients.length; i++) {
    db.from(config.objectStoreName).insert(clients[i]).then(function (args) {
        deferred.resolve(clients.length);
    }, function (args) {
        deferred.reject("Client item couldn't be added!");
    });
}

When doing something very similar with the native indexedDB, it's working fine:

var store = db.transaction(["client"], "readwrite").objectStore("client");
for (var i = 0; i < clients.length; i++) {
    var request = store.put(clients[i]);
}

// the handlers below end up attached to the request of the last put only
request.onsuccess = function() {
    deferred.resolve(clients.length);
}

request.onerror = function(e) {
    deferred.reject("Client item couldn't be added!");
}
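For completeness, an equivalent native variant listens on the transaction rather than the last request. A minimal sketch, assuming the same db, clients and deferred objects as above:

var tx = db.transaction(["client"], "readwrite");
var store = tx.objectStore("client");

for (var i = 0; i < clients.length; i++) {
    store.put(clients[i]); // queue every put on the same transaction
}

// fires once all queued puts have been committed
tx.oncomplete = function () {
    deferred.resolve(clients.length);
};

// fires if any put fails and the transaction aborts
tx.onerror = function (e) {
    deferred.reject("Client item couldn't be added!");
};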

When the "clients" array stays below a few thousand items it's OK, but at 50,000 the tab hangs and then crashes. With the native implementation, upserting the same 50,000 records only takes a few seconds (on Chrome).

Am I missing something, i.e. is there another method to batch insert records with Linq2indexedDB, or does Linq2indexedDB simply not support batch insert/update?

Christophe Vidal
  • This is unfortunately at least in part due to IDB limitations around sustained writes. I discuss some reasons [here](http://stackoverflow.com/a/22315353/317937) why you'd want to re-architect away from your current approach. – buley Mar 20 '14 at 19:21
  • But the native IDB works great for inserting; Linq2indexedDB is the problem. On the other hand, I'm struggling with the native implementation for complex queries (something like "select field1, field2, field3 from table where field1 > 10 and field2 = 'bla' and field3 > 0 and field3 < 100"). I'm experiencing the same issue with the native implementation as the OP here: http://stackoverflow.com/questions/21731347/indexeddb-idbkeyrange-compound-multiple-index-not-working – Christophe Vidal Mar 20 '14 at 19:48
  • Would seem like a memory issue, although you'd think most objects would be Garbage Collected after 50k writes. I'd be happy to help debug if you've got a test page by chance. – buley Mar 20 '14 at 19:58
  • 1
    I have simplified the site and you can find an example here: http://pearsonpoc.force.com/#/select-client. The problem appears in the first batch, it can't complete the first 50k. In order to start the process, you need to make a "pull to refresh" – Christophe Vidal Mar 20 '14 at 20:41

1 Answer


Thank you for trying the Linq2indexedDB library. It is still a work in progress, so feedback like this is much appreciated. If you have any other feedback, just let me know and I'll see what I can do for you.

You are correct: the DbContext doesn't provide a bulk insert for now. I will look into providing this functionality.

I think I know why you are seeing the performance issue. The way it currently works, a connection is created and closed for every insert you do. This is a choice I made in the library, because I wanted to be sure I'm working against the latest version of the database every time I connect. I'm thinking about changing this and making it possible to cache the connection, which would definitely improve performance.

Other things that can affect performance:

- Having debugging enabled (this writes log information to the console)
- The viewer inside the library: for every insert, an update is sent to this object

As you mentioned, for now you can use the native solution, or you can take advantage of the wrapper inside the library, which lets all the inserts share a single transaction:

// open (or create) the database once and build a single transaction/object store promise
var dbPromise = linq2indexedDB.core.db("name", 1);
var transactionPromise = linq2indexedDB.core.transaction(dbPromise, ["objectstore"]);
var objectStorePromise = linq2indexedDB.core.objectStore(transactionPromise, "objectstore");

// every insert reuses the same object store promise
linq2indexedDB.core.insert(objectStorePromise, {}).then(success, error);
linq2indexedDB.core.insert(objectStorePromise, {}).then(success, error);
linq2indexedDB.core.insert(objectStorePromise, {}).then(success, error);
...
linq2indexedDB.core.insert(objectStorePromise, {}).then(success, error);

function success(args) {
    var data = args[0];
    var primaryKey = args[1];
    var transaction = args[2];
    var originalEvent = args[3];
}

function error(args) {
    var error = args;
}

transactionPromise.then(function () { 
    // Transaction completed
    // Bulk insert done.
});
Kristof Degrave
  • The wrapper is as slow as the other method for inserting data. But thanks for the explanation; I'm now trying to combine both approaches, i.e. I use the native method to insert data and your plugin to retrieve it. That's going a bit off topic now, but is there any way to limit the number of results? When the query is very selective it goes quite fast, but when there are thousands of matching records the query is really slow (I don't care about ordering, so the first records would be good enough). (I didn't get it working with abort().) – Christophe Vidal Mar 21 '14 at 18:05
  • For now, you can't limit the results. The thing is, when you filter on multiple fields, only one of them is passed to the indexedDB API; all the others are filtered inside a worker thread. That is probably the factor slowing it down. But about how many records are you trying to add? I'll try to simulate it and look at what is slowing it down. – Kristof Degrave Mar 21 '14 at 20:03
  • I did some tests with 50K records in the DB. Actually it's the opposite of what you say: the finer the search, the faster it is. It's a "real-time" search that reacts on key-up, so when the user types an "a", for example, it searches through the whole DB and returns thousands of results. When the search is finer it gets faster. Btw, your Worker implementation is very efficient on Chrome, but unfortunately much less so on Firefox (though that may be down to the browser more than to your code). Firefox is also very slow when searching on just one key. – Christophe Vidal Mar 22 '14 at 16:39
  • Are you using the progress callback on the select or the complete to show the results? – Kristof Degrave Mar 22 '14 at 19:17
  • Yes, that's exactly what I do. And to stop the execution, I now throw an exception in the progress callback after X results (I use the success callback too in case I get fewer than X results in the progress). It goes faster but is very dirty. Any better solution? :) (As said before, I can't use abort() here; it just doesn't work and tells me the transaction has already finished.) – Christophe Vidal Mar 23 '14 at 11:16
  • Yesterday I added both the batch insert functionality and the limit. Only I'm looking for a clean way to abort the cursor. – Kristof Degrave Mar 24 '14 at 19:35
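Regarding the limit discussed in the comments above: with the native API you can cap a query simply by not calling continue() once enough matches have been collected. A minimal sketch, assuming a "client" store with an index named "field1" (both names are placeholders) and the same deferred object as in the question:

var tx = db.transaction(["client"], "readonly");
var index = tx.objectStore("client").index("field1");
var results = [];
var limit = 50; // stop collecting after this many matches

index.openCursor(IDBKeyRange.lowerBound(10)).onsuccess = function (e) {
    var cursor = e.target.result;
    if (cursor && results.length < limit) {
        results.push(cursor.value);
        cursor.continue(); // request the next match
    } else {
        deferred.resolve(results); // limit reached or no more matches
    }
};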