I'm going to use SQLite to save a lot of data in a real-time environment.
To avoid having to find free disk space (or move pages in the DB file) for new data written at run time, I want to build the tables in advance and insert into every cell the largest value its type can hold. That way, at run time there will be only UPDATE queries.
The building and inserting of the data is done in journal_mode=WAL.
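To show what I mean, here is a minimal sketch of the build-time setup as I have it now (illustrated with Python's built-in sqlite3 for brevity; the file name is made up):

```python
import os
import sqlite3
import tempfile

# Hypothetical file name; the real build writes several DB files to the SD card.
path = os.path.join(tempfile.mkdtemp(), "build.db")

conn = sqlite3.connect(path)
# Current setting: WAL journaling during the whole build phase.
# The PRAGMA returns the mode that was actually set.
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
conn.close()
print(mode)  # "wal"
```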
I have 6 different DB files to build. Each DB has between 10 and 200 tables, and all the tables in all the DBs have the same layout:
ID | TimeStart | Float data | Float data | Float data
--------------------------------------------------------------------------------
The only difference is that some tables have 100,000 rows and some have 500,000 rows.
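To make the shape concrete, the preallocation I'm describing looks roughly like this (sketched in Python's sqlite3; the table name, column names, and the choice of placeholder value are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # the real build writes to the SD card

# Layout from the question: ID | TimeStart | three float data columns.
conn.execute(
    "CREATE TABLE t1 (ID INTEGER PRIMARY KEY, TimeStart REAL, "
    "d1 REAL, d2 REAL, d3 REAL)"
)

# Insert the widest value each cell can hold, so later UPDATEs never need
# to grow a row; all inserts go into a single transaction.
big = 1.7976931348623157e308  # largest IEEE-754 double, as a placeholder
with conn:  # the with-block wraps the inserts in one transaction
    conn.executemany(
        "INSERT INTO t1 VALUES (?, ?, ?, ?, ?)",
        ((i, big, big, big, big) for i in range(100_000)),
    )

rows = conn.execute("SELECT COUNT(*) FROM t1").fetchone()[0]
print(rows)  # 100000
```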
The DBs are built on an SD card with an ARM9 CPU (running Linux), so building them takes a very long time: several days.
How can I speed up the building? Are there any PRAGMAs or tricks I can use? Can I copy a ready-made table?
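By "copy a ready-made table" I mean something along these lines: build one template table once, then clone it into each target DB, e.g. by attaching the template file and bulk-copying the rows (sketched in Python; file and table names are made up):

```python
import os
import sqlite3
import tempfile

tmpdir = tempfile.mkdtemp()
template = os.path.join(tmpdir, "template.db")  # hypothetical file names
target = os.path.join(tmpdir, "target.db")

schema = ("CREATE TABLE {name} (ID INTEGER PRIMARY KEY, TimeStart REAL, "
          "d1 REAL, d2 REAL, d3 REAL)")

# Build the template table once, with placeholder rows.
src = sqlite3.connect(template)
src.execute(schema.format(name="t_template"))
with src:
    src.executemany("INSERT INTO t_template VALUES (?, ?, ?, ?, ?)",
                    ((i, 0.0, 0.0, 0.0, 0.0) for i in range(1000)))
src.close()

# Clone into the target DB: attach the template file and bulk-copy.
# (CREATE TABLE ... AS SELECT would drop the PRIMARY KEY, so the schema
# is declared explicitly and the rows copied with INSERT ... SELECT.)
dst = sqlite3.connect(target)
dst.execute("ATTACH DATABASE ? AS tpl", (template,))
dst.execute(schema.format(name="t_copy"))
with dst:
    dst.execute("INSERT INTO t_copy SELECT * FROM tpl.t_template")

copied = dst.execute("SELECT COUNT(*) FROM t_copy").fetchone()[0]
print(copied)  # 1000
dst.close()
```

Is this kind of cloning faster than inserting every row from scratch, or is there a better way?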
It is important to mention that robustness of the DB is not important during the build process: speed matters much more to me than the possibility of DB corruption.