
I ran into a problem with multiple inserts into an SQLite database as my table grew. With about 200 inserts into the table, the UI freezes even though I run the insert queries on a background thread. (A synchronized method might be the cause: the database is busy, so reads must block until the inserts have finished.) I found This Link very helpful for understanding the many options I can tune in the database, but I can't use any of them except SYNCHRONOUS. I thought about using a prepared statement, but I use a ContentProvider, and as the source code shows, it already uses one by default. The following code is from the Android source:

public long insertWithOnConflict(String table, String nullColumnHack,
            ContentValues initialValues, int conflictAlgorithm) {
        acquireReference();
        try {
            StringBuilder sql = new StringBuilder();
            sql.append("INSERT");
            sql.append(CONFLICT_VALUES[conflictAlgorithm]);
            sql.append(" INTO ");
            sql.append(table);
            sql.append('(');

            Object[] bindArgs = null;
            int size = (initialValues != null && initialValues.size() > 0)
                    ? initialValues.size() : 0;
            if (size > 0) {
                bindArgs = new Object[size];
                int i = 0;
                for (String colName : initialValues.keySet()) {
                    sql.append((i > 0) ? "," : "");
                    sql.append(colName);
                    bindArgs[i++] = initialValues.get(colName);
                }
                sql.append(')');
                sql.append(" VALUES (");
                for (i = 0; i < size; i++) {
                    sql.append((i > 0) ? ",?" : "?");
                }
            } else {
                sql.append(nullColumnHack + ") VALUES (NULL");
            }
            sql.append(')');

            SQLiteStatement statement = new SQLiteStatement(this, sql.toString(), bindArgs);
            try {
                return statement.executeInsert();
            } finally {
                statement.close();
            }
        } finally {
            releaseReference();
        }
    }

Correct me if I'm wrong about this.

I can't use a transaction in this case because I don't know when the incoming messages will end, and I can't wait or insert with a delay before getting the next message.

I can't use journal_mode = MEMORY either, because it's risky.

Is there any way to optimize the inserts? My code is not important; it's a simple insert, but into a big table (both in number of columns and number of rows). If you want to see my code, tell me and I will post it.

Shayan Pourvatan
  • Inserting into 200 tables at once?? – Khushal Chouhan Jun 06 '15 at 08:37
  • @KhushalChouhan No, 200 rows inserted into one table. – Shayan Pourvatan Jun 06 '15 at 08:38
  • Oh, then you can do it as a background service. Check AsyncTask; if you feel it's right for your solution, I will give you sample code. – Khushal Chouhan Jun 06 '15 at 08:41
  • @KhushalChouhan I insert on a background thread, as I mentioned in my question. – Shayan Pourvatan Jun 06 '15 at 08:42
  • What are you using to create the background thread? AsyncTask, Service, a normal thread, AsyncLoader? – Khushal Chouhan Jun 06 '15 at 08:48
  • @KhushalChouhan A normal thread. – Shayan Pourvatan Jun 06 '15 at 08:56
  • Use AsyncTask; it's more efficient than a normal thread. – Khushal Chouhan Jun 06 '15 at 08:58
  • @KhushalChouhan It might be more efficient than a normal thread, but my question is about something else; 200 inserts shouldn't take too long. – Shayan Pourvatan Jun 06 '15 at 09:06
  • Problem is, you're also doing it as separate transactions... each time you open the DB and close it, that's a whole lot of precious time. And if you really can't wait to batch the inserts, you're SOL - there's no way to minimize the time needed to open and close the database. Perhaps you can find a middle ground - get 10 responses, execute a transaction, and get more responses in the back? – Zadrox Jun 06 '15 at 09:08
  • I can't change the thread to `AsyncTask` either, because I use the smack library and one of its methods returns my data, so if I wanted to use `AsyncTask` I would have to create it inside the thread, which I think is redundant. – Shayan Pourvatan Jun 06 '15 at 09:08
  • @Acco I can't; my app is a messaging application, and I don't know the number of offline messages I have to fetch. – Shayan Pourvatan Jun 06 '15 at 09:09
  • I think you need to rethink how you're processing incoming messages, even if they're offline ones. Because your clients aren't going to enjoy a laggy experience if you're having issues dealing with retrieving 100-200 messages at once. I'd consider using some sort of thread-safe queue where you can pull multiple messages off the front of the queue and do a transaction on them while writing incoming messages. The better way to do it is via GCM though, if that's a possibility. – Zadrox Jun 06 '15 at 09:18
  • @Acco I thought about that too, but I can't keep my messages in memory: if a message isn't inserted into the DB, it's lost, so I must persist it somewhere (a file or the DB). I thought about appending to a file and then inserting with a transaction, but I don't think that would help. I also looked into a circular buffer that inserts at the tail and reads from the head, but didn't find a solution. – Shayan Pourvatan Jun 06 '15 at 09:25
  • @Acco How can `GCM` help me? – Shayan Pourvatan Jun 06 '15 at 09:26
  • Actually, GCM can't. Just thought of the use case where it fails. – Zadrox Jun 06 '15 at 09:34

2 Answers


Use the AsyncTask class to do any background job. It runs the work on a background thread so your UI stays responsive. Do the insert by overriding AsyncTask's doInBackground method, and if you want to update the UI after inserting, override onPostExecute. Read the AsyncTask documentation for more details.
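The pattern the answer describes (do the DB work off the main thread, then run a completion callback) can be sketched in plain Java with `java.util.concurrent`, since `AsyncTask` itself needs the Android runtime. `insertMessages` is a hypothetical stand-in for the real database insert; on Android, `onPostExecute` would additionally post the callback back to the main thread:

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.IntConsumer;

public class BackgroundInsertSketch {
    // One writer thread, so inserts are serialized like a single AsyncTask queue.
    static final ExecutorService DB_EXECUTOR = Executors.newSingleThreadExecutor();

    // Hypothetical stand-in for the real SQLite insert; returns rows written.
    static int insertMessages(List<String> messages) {
        return messages.size();
    }

    // doInBackground equivalent runs on the executor; onDone plays the role
    // of onPostExecute (on Android you would post it to the main thread).
    static void insertInBackground(List<String> messages, IntConsumer onDone) {
        DB_EXECUTOR.submit(() -> {
            int inserted = insertMessages(messages);
            onDone.accept(inserted);
        });
    }

    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);
        AtomicInteger result = new AtomicInteger();
        insertInBackground(List.of("a", "b", "c"), n -> {
            result.set(n);
            done.countDown();
        });
        done.await(); // wait for the background work in this demo
        System.out.println("inserted=" + result.get());
        DB_EXECUTOR.shutdown();
    }
}
```

This keeps the UI thread free, but note it does not by itself fix the original problem: the inserts still run one transaction each.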

Saptak Niyogi

You should batch as many inserts as possible into a single transaction.

There is a simple method to automatically merge operations into relatively few transactions:

  1. Put incoming operations into a queue.
  2. As long as the queue is not empty, execute the operations inside a single transaction.
  3. When the queue becomes empty, commit the transaction. (You could also use some upper limit on the number of operations in a single transaction.)

When you get few incoming messages, each one will get its own transaction. When you get many incoming messages, this algorithm will generate large transactions.
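The drain-and-commit loop above can be sketched in plain Java. The `Db` interface here is a hypothetical stand-in for `SQLiteDatabase`'s `beginTransaction()` / `insert()` / `setTransactionSuccessful()` + `endTransaction()` calls, so the algorithm itself stays testable:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BatchingWriter {
    // Hypothetical stand-in for the real SQLiteDatabase transaction API.
    interface Db {
        void begin();
        void insert(String message);
        void commit();
    }

    static final int MAX_BATCH = 100; // upper limit on operations per transaction

    // Blocks until at least one message arrives, then writes everything that is
    // currently queued inside one transaction. Returns the number of rows written.
    static int drainOnce(BlockingQueue<String> queue, Db db) throws InterruptedException {
        String msg = queue.take(); // wait for the first message
        db.begin();
        int n = 0;
        do {
            db.insert(msg);
            n++;
            // Keep pulling while the queue is non-empty and the batch isn't full;
            // poll() returns null once the queue is drained.
            msg = (n < MAX_BATCH) ? queue.poll() : null;
        } while (msg != null);
        db.commit(); // queue empty (or batch full): commit the whole batch
        return n;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        for (int i = 0; i < 5; i++) queue.add("msg" + i);

        List<String> log = new ArrayList<>();
        Db fake = new Db() {
            public void begin() { log.add("BEGIN"); }
            public void insert(String m) { log.add(m); }
            public void commit() { log.add("COMMIT"); }
        };

        int n = drainOnce(queue, fake);
        System.out.println(n + " rows in one transaction: " + log);
    }
}
```

Run `drainOnce` in a loop on a dedicated writer thread and have the message listener only call `queue.put(...)`: a lone message gets its own transaction immediately, while a burst of offline messages is merged into a few large transactions. Since the message is handed to the writer before the server ack, persistence still happens before anything is dropped from memory.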

CL.
  • Thanks for your answer. I had a problem with storing messages in memory: if the application is killed, the messages are lost. I use stream management to tell the server that a message has arrived, and I can't override that to send the stream ack only after inserting into the DB. – Shayan Pourvatan Jun 06 '15 at 11:19