I am new to PouchDB and I am trying to do something simple: pull the data from my remote CouchDB database into a new local PouchDB database.
The following works, of course:
PouchDB.replicate(remote, local, { live: false, retry: false })
but it is painfully slow! I take it that's because it is doing so many HTTP requests sequentially.
As an optimisation I thought: OK, let's fetch the data first in one HTTP request with `remote.allDocs`, then use `local.bulkDocs` to insert the docs into the local db. But that is not equivalent to replicating the db, and the `_rev` field in the docs makes `bulkDocs` fail.
What should I do?
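For reference, here is a sketch of the one-shot copy I attempted (the `rowsToDocs` helper and `copyOnce` name are just my shorthand; `remote` and `local` are PouchDB instances as above):

```javascript
// Pull every doc out of an allDocs({include_docs: true}) response.
const rowsToDocs = (result) => result.rows.map((row) => row.doc);

async function copyOnce(remote, local) {
  // One big HTTP request instead of many small ones.
  const result = await remote.allDocs({ include_docs: true });
  // This is where it breaks: the docs still carry their remote
  // `_rev`, so bulkDocs rejects them as conflicting updates.
  return local.bulkDocs(rowsToDocs(result));
}
```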
More details
about the database
- there currently are 1202 documents
- there used to be about 4000
- I just added a couple of revisions to each document
about the performances:
- it takes about 2 seconds to run `remote.allDocs({include_docs: true})`; the output is 1.15 MB
- it takes 1 minute to run `PouchDB.replicate(remote, local, { live: false, retry: false })`; the output is about 5 MB
- replication takes about 24 seconds with a `filter: doc => !doc._deleted` added
- and about 20 seconds with a `filter: 'new_device/excludeDeletedDocs'`, i.e. the same filter in a design document
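For completeness, the filter I used, both inline and as the design document that `'new_device/excludeDeletedDocs'` refers to (the variable names here are just mine):

```javascript
// Inline filter passed to replicate(): keep everything except deletions.
const excludeDeletedDocs = (doc) => !doc._deleted;

// The same filter stored on the remote db as a design document
// (CouchDB stores filter functions as strings).
const designDoc = {
  _id: '_design/new_device',
  filters: {
    excludeDeletedDocs: 'function (doc) { return !doc._deleted; }'
  }
};

// Usage, for each of the two timings above:
// PouchDB.replicate(remote, local, { filter: excludeDeletedDocs });
// PouchDB.replicate(remote, local, { filter: 'new_device/excludeDeletedDocs' });
```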
About the replication lasting 1 minute: that surprised me, as I thought it was faster, but that is how long it takes now.
about the app
- there is one database per user
- they can access it through multiple devices, online and offline
- most docs are small, are edited a couple of times, and are relatively short-lived
- a sizeable minority of them last virtually forever
- a fraction of them can be bigger and can be edited many, many times