
I load 700 JSON web pages and add them to my local database like this:

private static Api.Root LoadJsonPages(int PageNbr)
{
    Uri url = new Uri("http://www.website.com/page=" + PageNbr);
    string info = new WebClient().DownloadString(url);
    Api.Root result = JsonConvert.DeserializeObject<Api.Root>(info);
    return result;
}

private void LoadandAdd()
{
    Api.Root result = new Api.Root();
    cn.Open();
    for (int i = 0; i < 700; i++)
    {
         result = LoadJsonPages(i);
         for(int j = 0; j < result.NbrObject; j++)
         {
                StringBuilder sqlStr = new StringBuilder("INSERT into MyDatatable values (@Data1, @Data2, @Data3)");
                SqlCommand cmd = new SqlCommand(sqlStr.ToString(), cn);
                cmd.Parameters.Add(new SqlParameter("@Data1", result.items[j].Data1.value.ToString()));
                cmd.Parameters.Add(new SqlParameter("@Data2", result.items[j].Data2.value.ToString()));
                cmd.Parameters.Add(new SqlParameter("@Data3", result.items[j].Data3.value.ToString()));
                cmd.ExecuteNonQuery();
         }
    }
    cn.Close(); 
}

This code works, but it is very slow: it takes 20 minutes to run (700 pages, 17,500 objects, i.e. 25 objects per page).

Measured separately, loading and deserializing the 700 pages takes 10 minutes, and adding the data to the database takes another 10 minutes.

How can I make this faster?

KTG
  • If I would have to guess, the web server is the slow thing here. You can't really make that any faster from the client end. All you can do on the client end is to send more HTTP requests in parallel. (BTW, building a new SqlCommand object in every loop iteration directly defies the purpose of SqlCommand objects. Move that out of the loop.) – Tomalak May 14 '17 at 09:22
  • To quickly insert a lot of data into SQL Server, look here: http://stackoverflow.com/questions/13722014/insert-2-million-rows-into-sql-server-quickly. – Tomalak May 14 '17 at 09:27
  • 1.7 seconds per page including navigating and getting 25 objects. Not bad. I would use a sniffer like wireshark or fiddler to get the actual times of transfer. There may be certificates that need to be obtained which may be slowing down transfer rate. – jdweng May 14 '17 at 09:39
  • 1
    @Tomalak thanks for your link and advices, the time to perform the update decrease significantly +1 – KTG May 16 '17 at 11:49
  • Good to hear! If you have improved code, please post it as an answer so that this thread can be closed as solved. – Tomalak May 16 '17 at 13:58
  • Now it takes 10 minutes; most of the time is consumed by the HTTP requests and deserialization. – KTG May 16 '17 at 20:06
  • But bulk copy returns a duplicate primary key error and doesn't finish the insert command... there are no duplicate keys in my first DataTable. I'm trying to fix it. – KTG May 17 '17 at 10:21
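
The two suggestions from the comments (send HTTP requests in parallel; replace row-by-row `INSERT`s with `SqlBulkCopy`) could be combined roughly like this. This is only a sketch: the URL is the question's placeholder, and the shape of `Api.Root` (`items`, `Data1`/`Data2`/`Data3`) is assumed from the code above.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

private static async Task LoadAndAddFastAsync(string connectionString)
{
    using (var http = new HttpClient())
    {
        // Fire all 700 requests concurrently instead of one at a time;
        // HttpClient is safe for concurrent use.
        IEnumerable<Task<Api.Root>> downloads = Enumerable.Range(0, 700).Select(async i =>
        {
            string json = await http.GetStringAsync("http://www.website.com/page=" + i);
            return JsonConvert.DeserializeObject<Api.Root>(json);
        });
        Api.Root[] pages = await Task.WhenAll(downloads);

        // Stage every row in a DataTable, then push them to SQL Server
        // in a single bulk copy instead of 17,500 separate INSERTs.
        var table = new DataTable();
        table.Columns.Add("Data1", typeof(string));
        table.Columns.Add("Data2", typeof(string));
        table.Columns.Add("Data3", typeof(string));

        foreach (Api.Root page in pages)
            foreach (var item in page.items)
                table.Rows.Add(
                    item.Data1.value.ToString(),
                    item.Data2.value.ToString(),
                    item.Data3.value.ToString());

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "MyDatatable";
            await bulk.WriteToServerAsync(table);
        }
    }
}
```

If the server throttles or rejects 700 simultaneous connections, the downloads can be limited with a `SemaphoreSlim` around the `GetStringAsync` call; and if the table has a primary key, make sure the staged rows are de-duplicated before the bulk copy, since `SqlBulkCopy` aborts on a key violation.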

0 Answers