
I am trying to create a ring buffer so that I can store a lot of JSON data.

The goal is to save around 300,000 records and change them cyclically. For the test, I randomly created 1,000 records (with 10 float values per record) and saved them as JSON in IndexedDB.

To persist the data in IndexedDB, I used a loop (from 0 to 99) and the put command.

My observation is the following:

  • On the first pass, the DB is created and the 100 records are saved successfully.
  • The first refresh also works: the newly generated random float values are saved. But the memory usage increases significantly.
  • After a second refresh, the random data is no longer updated, because the memory usage has exceeded the limit.

The keys for the IndexedDB records are set in a loop (starting at 0 and ending at 99).

In other browsers such as Firefox and MS Edge, the test runs fine, even after 100 refreshes.

Does anyone know the cause of this problem, or better yet, have a solution?

It would also be acceptable to delete all records from the IndexedDB while the page is reloading. So I tried to remove all data during initialization, but even then the memory usage stayed at a high level (over 230 MB).
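That reset-on-reload attempt can be sketched with `IDBObjectStore.clear()` in a single readwrite transaction. The function name `clearStore` and its signature are illustrative; `db` and `DB_STORE_NAME` are assumed to exist as in the rest of my code (browser-only API):

```javascript
// Sketch of the cleanup step: wipe the whole object store in one
// readwrite transaction before refilling it.
function clearStore(db, storeName, done) {
  var tx = db.transaction(storeName, 'readwrite');
  tx.objectStore(storeName).clear();
  // The transaction auto-commits once the clear request settles.
  tx.oncomplete = function () { done(null); };
  tx.onerror = function () { done(tx.error); };
}
```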

function getObjectStore(store_name, mode) {
  var tx = db.transaction(store_name, mode);
  return tx.objectStore(store_name);
}

function putDbElement(number, json, _callback) {
  var obj = {
    number: number,
    json: json
  };

  // Note: this opens a new transaction for every call.
  var store = getObjectStore(DB_STORE_NAME, 'readwrite');
  var req = store.put(obj);
  // Note: put() is asynchronous, so this callback fires before the
  // request has actually completed.
  _callback();
}

for (let i = 0; i < 100; i++) {
  putDbElement(
    i,
    getRandomJson(1000),
    function () {
      // `let` gives each iteration its own binding of i.
      console.log('created: ' + i);
    }
  );
}
David
  • Can you please clarify why you are using many transactions instead of one transaction? – Josh Oct 22 '18 at 14:34

2 Answers


IndexedDB is asynchronous, and you are opening a new transaction for each iteration; that could be the reason for the high memory usage. You need to handle the success and error events. You can still use a loop, but it must run within a single transaction, and each put operation should then have its own success/error handlers as well.
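A minimal sketch of that pattern (the function name `saveAllRecords` and its signature are illustrative; it assumes `db` is an already-open IDBDatabase and `storeName` names an existing object store, as in the question):

```javascript
// One readwrite transaction for the whole batch, instead of one per
// put. The transaction auto-commits once the last request settles.
function saveAllRecords(db, storeName, records, done) {
  var tx = db.transaction(storeName, 'readwrite');
  var store = tx.objectStore(storeName);

  records.forEach(function (record) {
    var req = store.put(record);
    // Per-request handlers, as described above.
    req.onsuccess = function () {
      console.log('created: ' + record.number);
    };
    req.onerror = function () {
      console.error('put failed:', req.error);
    };
  });

  tx.oncomplete = function () { done(null); };
  tx.onerror = function () { done(tx.error); };
}
```

Opening a single transaction amortizes the per-transaction overhead across all 100 puts.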

dp2050

Thanks a lot for your quick answer.

I've extended the code with onsuccess and onerror handlers, but I still had the same problem.

I found no solution, but I did find an explanation for the problem: IndexedDB size keeps growing even though data saved doesn't change.

Chrome's IndexedDB implementation is backed by LevelDB for speed, which defers reclaiming the space of overwritten data until compaction runs; in my case I find that behavior irritating.

David