
The basic idea is that every row has a unique item ID, which I use to decide which row to update with a new quantity. I'm importing inventory from a company's API into my own database, and there are about 80k items. They have separate API endpoints for item details and item quantities, so I first need to get all the inventory from the details endpoint, then call the quantities endpoint and update every item's quantity.

My initial solution was to just loop through the results I get back and update one row per query. This takes about an hour to an hour and a half to update all 80k items. Is this just how long it takes to update that many items, or is there some SQL magic I could do?
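For reference, the per-row approach described above looks roughly like this (the table and column names `Items`, `ItemId`, and `Quantity` are assumptions, as is the `quantities` collection returned by the API):

```csharp
// Sketch of the one-UPDATE-per-row loop: ~80k round trips to the server.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    foreach (var item in quantities)
    {
        using (var cmd = new SqlCommand(
            "UPDATE Items SET Quantity = @qty WHERE ItemId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@qty", item.Quantity);
            cmd.Parameters.AddWithValue("@id", item.ItemId);
            cmd.ExecuteNonQuery();
        }
    }
}
```

Each iteration pays the full cost of a network round trip plus statement execution, which is why this scales so poorly.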

Thanks

saint_foo
  • Can't you use a SQL `where` clause to decide which rows to update? SQL isn't really supposed to be used like this. – Danny Cullen Nov 25 '16 at 15:03
  • http://stackoverflow.com/a/20635881/284240 – Tim Schmelter Nov 25 '16 at 15:05
  • @DannyCullen Yes that's what I said I was doing and the performance is terrible for that many records. How is sql not meant to be used to update rows in a db? – saint_foo Nov 25 '16 at 20:48
  • @TimSchmelter thanks this is just what I was looking for, the performance difference is amazing, it takes like 30 seconds now. – saint_foo Nov 25 '16 at 20:48
  • Does this answer your question? [Bulk Update in C#](https://stackoverflow.com/questions/20635796/bulk-update-in-c-sharp) – TAbdiukov Feb 10 '21 at 01:46

2 Answers


The first thing I would do is store the update data in a separate table, then run a query such as:

```sql
update t
    set . . .
    from t join
         results r
         on t.?? = r.??;
```

Assuming the column used for the join is indexed, then this should be faster than an hour and a half.
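With the question's schema, a concrete version might look like this (the names `Items`, `ItemQuantities`, `ItemId`, and `Quantity` are assumptions based on the question, not the asker's actual schema):

```sql
-- One set-based statement instead of 80k individual updates.
-- ItemQuantities is a staging table bulk-loaded with the API results.
UPDATE i
SET i.Quantity = r.Quantity
FROM Items AS i
JOIN ItemQuantities AS r
    ON i.ItemId = r.ItemId;
```

The server resolves the join once and touches each row a single time, which is why this is orders of magnitude faster than per-row updates.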

Gordon Linoff
  • Yet another DB (not MySQL) that supports hash joins. Not only are indexes not needed here; for large volumes they will kill performance. For 80k rows, assuming a "standard" average row size, the update will take less than a second. – David דודו Markovitz Nov 25 '16 at 16:42

You can use Parallel.ForEach to build an update script first, then execute the script with a SqlCommand. The update script consists of multiple UPDATE T-SQL statements.

Alternatively, use the SqlBulkCopy class from the SQL client assembly. You can refer to this article: http://www.developerfusion.com/article/122498/using-sqlbulkcopy-for-high-performance-inserts/

Use SqlBulkCopy to insert the data into a temp table first, then write a stored procedure (or a single UPDATE statement) that updates the destination table from the temp table.
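A minimal sketch of this approach, assuming a destination table `Items` keyed on `ItemId` and a `DataTable` named `quantitiesTable` already filled from the API (all of these names are assumptions):

```csharp
// Bulk-load the quantities into a temp table, then update with one join.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Create a temp table shaped like the incoming data.
    using (var cmd = new SqlCommand(
        "CREATE TABLE #ItemQuantities (ItemId INT PRIMARY KEY, Quantity INT)",
        conn))
    {
        cmd.ExecuteNonQuery();
    }

    // 2. Bulk-copy all ~80k rows in one round trip.
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "#ItemQuantities";
        bulk.WriteToServer(quantitiesTable); // DataTable with matching columns
    }

    // 3. Single set-based update from the temp table.
    using (var cmd = new SqlCommand(
        @"UPDATE i SET i.Quantity = t.Quantity
          FROM Items AS i
          JOIN #ItemQuantities AS t ON i.ItemId = t.ItemId", conn))
    {
        cmd.ExecuteNonQuery();
    }
}
```

The temp table lives on the same connection, so the bulk copy and the update run against it without any extra setup; the whole sequence typically completes in seconds for this volume.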

marc_s
Hao Vo