
I have 10000 records to insert into a SQLite database using multithreading.

private void btnThread_Click(object sender, EventArgs e)
{
    Thread th1 = new Thread(new ThreadStart(Save1));
    Thread th2 = new Thread(new ThreadStart(Save2));
    Thread th3 = new Thread(new ThreadStart(Save3));
    Thread th4 = new Thread(new ThreadStart(Save4));
    Thread th5 = new Thread(new ThreadStart(Save5));

    Thread th6 = new Thread(new ThreadStart(Save6));
    Thread th7 = new Thread(new ThreadStart(Save7));
    Thread th8 = new Thread(new ThreadStart(Save8));
    Thread th9 = new Thread(new ThreadStart(Save9));
    Thread th10 = new Thread(new ThreadStart(Save10));

    th1.Start();
    th2.Start();
    th3.Start();
    th4.Start();
    th5.Start();

    th6.Start();
    th7.Start();
    th8.Start();
    th9.Start();
    th10.Start();
}

In the code above, each thread calls a function that saves records, like below:

private void Save1()
{
    for (int i = 0; i < 1000; i++)
    {
        using (SQLiteConnection sqliteConn = new SQLiteConnection("Data Source='" + dbPath + "'"))
        {
            sqliteConn.Open();
            string date = DateTime.Now.ToString();

            string sqlInsert = "insert into PatientDetail (Name, Age, Date, PhoneNumber, Email, PatientSex, Status) values ('Patient Name1', 35, '" + date + "', '9856235674', 'patient@gmail.com', 'M', 1)";
            SQLiteCommand command = new SQLiteCommand(sqlInsert, sqliteConn);
            command.ExecuteNonQuery();
            sqliteConn.Close();
        }
    }
}

With this logic the records are inserted into the database correctly, but it takes >= 25 minutes to insert 10000 records.

When I checked, around 300 to 400 records were inserted per minute.

I also used a transaction to insert the records, but there was no performance improvement.

Is there any way I can improve the performance?

How does SQLite work internally with multithreading?

PNG
  • Did you check this: https://stackoverflow.com/questions/3852068/sqlite-insert-very-slow ? – Piotr Aug 03 '17 at 11:01
  • You should perform all 1000 inserts in a single connection and single transaction, for transactions to have any effect. Is there a reason you're doing this in multiple threads? It's probably faster to insert all 10000 records in a single transaction. – C.Evenhuis Aug 03 '17 at 11:03
  • Have you compared the time this takes with the time it takes doing this on a single thread? Concurrency-wise, SQLite is multiple-read but single-write. – Alex K. Aug 03 '17 at 11:05
  • SQLite is not good at doing multithreaded operations; this should never be done. Have a look at this [talk](https://www.youtube.com/watch?v=j7WnQhwBwqA&t=2434s). – Filip Cordas Aug 03 '17 at 11:25
  • @Piotr yes, by wrapping a batch of inserts in BEGIN/END statements. But I want to insert records one by one using multithreading. – PNG Aug 03 '17 at 11:26
  • Why do you want to use multithreading? What is the actual problem you're trying to solve? – CL. Aug 03 '17 at 11:40

1 Answer


[This article on SQLite.org](https://www.sqlite.org/threadsafe.html) describes SQLite's thread safety mechanisms and modes. By default, SQLite serializes all operations (serialize as in doing only one thing at a time, not as in object serialization), so it is thread safe, but this mode can be disengaged.

Inserting 1000 records is best done within a single transaction, and even if you don't use a single transaction, it will probably be faster if you don't open a new connection for each insert. But if your code is meant to simulate 1000 records being inserted one by one from separate pieces of code, without batching, then the performance you see in your test is representative of exactly that.
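A minimal sketch of that approach, assuming System.Data.SQLite and the schema from the question (the method name SaveAllInOneTransaction and the fixed sample values are placeholders, not code from the question or this answer): one connection, one transaction, and a parameterised command that is created once and reused.

private void SaveAllInOneTransaction()
{
    using (SQLiteConnection sqliteConn = new SQLiteConnection("Data Source='" + dbPath + "'"))
    {
        sqliteConn.Open();

        using (SQLiteTransaction tx = sqliteConn.BeginTransaction())
        using (SQLiteCommand command = sqliteConn.CreateCommand())
        {
            command.Transaction = tx;
            command.CommandText =
                "insert into PatientDetail (Name, Age, Date, PhoneNumber, Email, PatientSex, Status) " +
                "values (@name, @age, @date, @phone, @email, @sex, @status)";

            // Parameters are added once; only the values change per row.
            command.Parameters.AddWithValue("@name", "Patient Name1");
            command.Parameters.AddWithValue("@age", 35);
            command.Parameters.AddWithValue("@date", DateTime.Now.ToString());
            command.Parameters.AddWithValue("@phone", "9856235674");
            command.Parameters.AddWithValue("@email", "patient@gmail.com");
            command.Parameters.AddWithValue("@sex", "M");
            command.Parameters.AddWithValue("@status", 1);

            for (int i = 0; i < 10000; i++)
            {
                command.Parameters["@date"].Value = DateTime.Now.ToString();
                command.ExecuteNonQuery();
            }

            // One commit writes all 10000 rows; SQLite syncs to disk once
            // instead of once per insert.
            tx.Commit();
        }
    }
}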

Having ten threads compete to do the insertion is part of the problem, but not the whole problem. If you have 10 threads each performing 1000 inserts one by one, that's 10000 operations that SQLite has to serialize. Introducing some kind of batching would solve that, and would also let you use transactions, which would be a huge performance boost on its own.
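A rough sketch of one possible batching design (an illustrative assumption, not part of the original answer): worker threads only enqueue rows into a thread-safe collection, and a single dedicated writer thread drains the queue and commits each batch in one transaction. The names (_pendingDates, QueueInsert, WriterLoop, Flush) and the batch size of 500 are made up here; since every column except Date is constant in the question's code, only the date string is queued.

// Requires: using System; using System.Collections.Concurrent;
//           using System.Collections.Generic; using System.Data.SQLite;

// Worker threads only add to this queue; they never touch the database.
private readonly BlockingCollection<string> _pendingDates = new BlockingCollection<string>();

// Called from any thread instead of executing an INSERT directly.
public void QueueInsert()
{
    _pendingDates.Add(DateTime.Now.ToString());
}

// Call _pendingDates.CompleteAdding() when all producers are finished so the
// writer loop below can end.

// A single dedicated thread runs this loop and is the only writer to SQLite.
private void WriterLoop()
{
    using (SQLiteConnection sqliteConn = new SQLiteConnection("Data Source='" + dbPath + "'"))
    {
        sqliteConn.Open();

        using (SQLiteCommand command = sqliteConn.CreateCommand())
        {
            command.CommandText =
                "insert into PatientDetail (Name, Age, Date, PhoneNumber, Email, PatientSex, Status) " +
                "values ('Patient Name1', 35, @date, '9856235674', 'patient@gmail.com', 'M', 1)";
            command.Parameters.Add(new SQLiteParameter("@date"));

            const int batchSize = 500;
            List<string> batch = new List<string>(batchSize);

            // GetConsumingEnumerable blocks until items arrive and completes
            // after CompleteAdding() has been called and the queue is empty.
            foreach (string date in _pendingDates.GetConsumingEnumerable())
            {
                batch.Add(date);
                if (batch.Count >= batchSize)
                    Flush(sqliteConn, command, batch);
            }

            Flush(sqliteConn, command, batch);   // write any remainder
        }
    }
}

// Writes one batch inside a single transaction, then clears it.
private static void Flush(SQLiteConnection conn, SQLiteCommand command, List<string> batch)
{
    if (batch.Count == 0) return;

    using (SQLiteTransaction tx = conn.BeginTransaction())
    {
        command.Transaction = tx;
        foreach (string date in batch)
        {
            command.Parameters["@date"].Value = date;
            command.ExecuteNonQuery();
        }
        tx.Commit();
    }
    batch.Clear();
}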

Jesper
  • Thanks, I used a transaction and it gives a huge performance improvement, but I want to insert records one by one in order to write a log. – PNG Aug 03 '17 at 11:56
  • If you're going to be logging many, many messages a second, you can probably batch it so that it inserts them whenever there's been more than a certain number of entries, or every X seconds. If not, that might not be a problem. Try it out with your real usage (actual logging by the actual task that's going to be doing all this logging) and optimize it if it turns out to be a problem. – Jesper Aug 03 '17 at 12:21
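A sketch of that flush-by-count-or-time idea, reusing the hypothetical _pendingDates queue and Flush helper from the batching sketch above; the thresholds (500 entries, 5 seconds) are made-up values, not from the post.

// Only a sketch: flush when the batch is big enough OR old enough, so log
// entries never sit in memory for long even at low volume.
private void WriterLoopWithTimedFlush(SQLiteConnection sqliteConn, SQLiteCommand command)
{
    List<string> batch = new List<string>();
    DateTime lastFlush = DateTime.UtcNow;

    while (!_pendingDates.IsCompleted)
    {
        string date;
        // Wait at most one second so the time-based check below still runs
        // while nothing is being logged.
        if (_pendingDates.TryTake(out date, 1000))
            batch.Add(date);

        bool bigEnough = batch.Count >= 500;
        bool oldEnough = (DateTime.UtcNow - lastFlush) >= TimeSpan.FromSeconds(5);

        if (batch.Count > 0 && (bigEnough || oldEnough))
        {
            Flush(sqliteConn, command, batch);   // same helper as in the earlier sketch
            lastFlush = DateTime.UtcNow;
        }
    }

    // Write anything still pending once the producers are done.
    Flush(sqliteConn, command, batch);
}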