The Setup
I have native iOS and Android apps that sync data to and from my webserver. A requirement of the apps is that they work offline, so data is stored on the devices in SQLite databases.
The apps communicate with the server through a series of REST calls that return JSON, which the apps then store in their local databases.
My Problem
The scale of this data is large: some tables can have a million records, and the final size of the on-device databases can approach 100 MB.
The REST endpoints have to limit how much data they return, so they must be called many times with different offsets to complete a full sync.
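Roughly, a full sync currently looks something like this (shown in Python for brevity rather than the actual Swift/Kotlin; the endpoint name, parameters, and table schema are just placeholders):

```python
import json
import sqlite3

import requests

BASE_URL = "https://example.com/api/records"  # placeholder endpoint
PAGE_SIZE = 500

def full_sync(table: str, db_path: str) -> None:
    """Pull one table page by page and store the rows in the local SQLite DB."""
    conn = sqlite3.connect(db_path)
    offset = 0
    while True:
        resp = requests.get(
            BASE_URL,
            params={"table": table, "limit": PAGE_SIZE, "offset": offset},
            timeout=30,
        )
        resp.raise_for_status()
        rows = resp.json()  # assumed to be a plain JSON array of row objects
        if not rows:
            break           # past the last page
        for row in rows:
            # Placeholder schema: an id column plus the raw JSON payload
            conn.execute(
                f"INSERT OR REPLACE INTO {table} (id, payload) VALUES (?, ?)",
                (row["id"], json.dumps(row)),
            )
        conn.commit()
        offset += PAGE_SIZE
    conn.close()
```

With a million records and a page size in the hundreds, that is thousands of round trips per table.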
So I'm looking for ways to improve the efficiency of this process.
My Idea
One idea is to run a script on the server that builds a SQLite file from the server's database, compresses it, and puts it somewhere for the apps to download, effectively creating a snapshot of the server's current data.
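A minimal sketch of what that server-side script might look like, using SQLite as a stand-in for the real server database and placeholder table names and paths:

```python
import gzip
import os
import shutil
import sqlite3

SOURCE_DB = "server.db"          # stand-in for the real server database
SNAPSHOT_DB = "snapshot.sqlite"  # the file the apps would download
TABLES = ["records", "users"]    # whatever the apps actually need

def build_snapshot() -> str:
    if os.path.exists(SNAPSHOT_DB):
        os.remove(SNAPSHOT_DB)   # start from a clean file each run

    src = sqlite3.connect(SOURCE_DB)
    dst = sqlite3.connect(SNAPSHOT_DB)
    for table in TABLES:
        # Recreate the table schema in the snapshot file
        (create_sql,) = src.execute(
            "SELECT sql FROM sqlite_master WHERE type='table' AND name=?", (table,)
        ).fetchone()
        dst.execute(create_sql)

        # Copy the rows across in batches to keep memory bounded
        cur = src.execute(f"SELECT * FROM {table}")
        while True:
            rows = cur.fetchmany(10_000)
            if not rows:
                break
            placeholders = ",".join("?" * len(rows[0]))
            dst.executemany(f"INSERT INTO {table} VALUES ({placeholders})", rows)
    dst.commit()
    dst.close()
    src.close()

    # Compress the finished snapshot so the apps download less data
    with open(SNAPSHOT_DB, "rb") as f_in, gzip.open(SNAPSHOT_DB + ".gz", "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    return SNAPSHOT_DB + ".gz"
```

On a real server the source connection would be MySQL/Postgres/etc. rather than another SQLite file, but the shape of the job is the same: dump the needed tables into one SQLite file, then gzip it.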
The apps would download this snapshot but would still have to call their REST methods afterwards, in case something had changed since the snapshot was taken.
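On the app side, restoring the snapshot and catching up might look roughly like this (again Python just as shorthand for the Swift/Kotlin code; the snapshot URL, the timestamp header, and the "since" parameter are assumptions, not things I have today):

```python
import gzip

import requests

SNAPSHOT_URL = "https://example.com/downloads/snapshot.sqlite.gz"  # placeholder
CHANGES_URL = "https://example.com/api/changes"                    # placeholder
LOCAL_DB = "app.sqlite"

def restore_and_catch_up() -> None:
    # Download and decompress the snapshot into place as the local database
    resp = requests.get(SNAPSHOT_URL, timeout=60)
    resp.raise_for_status()
    snapshot_time = resp.headers.get("X-Snapshot-Time")  # assumed header
    with open(LOCAL_DB, "wb") as f:
        f.write(gzip.decompress(resp.content))

    # Then ask the existing REST layer only for rows changed since the snapshot
    changes = requests.get(CHANGES_URL, params={"since": snapshot_time}, timeout=30)
    changes.raise_for_status()
    apply_changes(changes.json())

def apply_changes(rows: list) -> None:
    # Stub: merge the changed rows into the freshly installed database
    pass
```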
The Question
This would add another layer of complexity to my web app, and I'm wondering whether it's the right approach. Are there other techniques people use when syncing large amounts of data?