
My multi-threaded Delphi application parses about 100k marketplace offers. Each worker thread writes the parsed data to a remote SQL Server. Currently each thread parses 3-4 offers per second, which means 10 threads fire about 35 update calls to SQL Server. Every second.

The idea is to implement optimized database writes – a sort of lazy bulk update. Each thread would accumulate 20-30 parsed offers and then write them to the database in a single pass. I assume that would be far more efficient than the current approach.
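Roughly, what I have in mind per worker thread is something like this (`TWorkerThread`, `FBuffer` and `WriteBatchToDatabase` are made-up names, just to illustrate the idea):

```delphi
uses
  System.Generics.Collections;

type
  TOffer = record
    Sku: string;
    Price: Currency;
  end;

// Called by the worker thread for every offer it has parsed.
// FBuffer: TList<TOffer> is owned by the thread, so no locking is needed.
procedure TWorkerThread.OfferParsed(const Offer: TOffer);
begin
  FBuffer.Add(Offer);
  if FBuffer.Count >= 25 then        // flush every 20-30 offers
  begin
    WriteBatchToDatabase(FBuffer);   // one multi-row write instead of 25 calls
    FBuffer.Clear;
  end;
end;
```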

I would be happy to hear your general comments and suggestions, as well as any light you can shed on techniques for lazy/delayed/chunked writes from a Delphi app to a SQL Server database.

Interface Unknown
  • Which Delphi version? – Uwe Raabe Mar 25 '14 at 09:55
  • Delphi XE4. Sorry to always forget to mention that ;) – Interface Unknown Mar 25 '14 at 10:17
  • Maybe [MSSQL and large INSERT statements](http://blog.staticvoid.co.nz/2012/8/17/mssql_and_large_insert_statements) is helpful, best performance is reached using "SQL bulk copy" (direct stream to database) - however it has disadvantages too – mjn Mar 25 '14 at 10:55
  • see http://stackoverflow.com/questions/8635818/multiple-insert-statements-vs-single-insert-with-multiple-values/8640583#8640583 – mjn Mar 25 '14 at 10:57
  • And there's still good old-fashioned BULK INSERTS from a flat file into the database. With a large data transfer app I developed (years ago) this was by far the fastest solution. But that was before large insert statements, and it only works if you can delay to batches of at least 1000 rows. – Jan Doggen Mar 25 '14 at 18:49
  • @JanDoggen I think your approach is exactly my case. I've never used BULK INSERT before, but the point *"Each batch is copied to the server as one transaction"* in [BULK INSERT (Transact-SQL)](http://technet.microsoft.com/en-us/library/ms188365.aspx) says it all. My data_struct is very simple. It contains just two fields: sku and price. A great approach would be to accumulate records in a local text file (which is faster in any case) and then fire them at SQL Server in a single bulk load. Thanks for the idea! – Interface Unknown Mar 27 '14 at 15:12
  • Rewriting my comment as an answer ;-) – Jan Doggen Mar 27 '14 at 15:34

3 Answers


Delphi XE4 contains FireDAC, which gives you two approaches for a solution: CachedUpdates and Array DML.
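For example, an Array DML insert could look roughly like this (a sketch, not tested code; the `dbo.Offers` table and its `Sku`/`Price` columns are assumed from the sku/price fields mentioned in the question's comments):

```delphi
uses
  FireDAC.Comp.Client;

// Sends a whole batch of rows to the server in a single Execute call.
procedure WriteBatch(Query: TFDQuery; const Skus: TArray<string>;
  const Prices: TArray<Double>);
var
  i: Integer;
begin
  Query.SQL.Text := 'INSERT INTO dbo.Offers (Sku, Price) VALUES (:Sku, :Price)';
  Query.Params.ArraySize := Length(Skus);       // one array slot per row
  for i := 0 to High(Skus) do
  begin
    Query.Params[0].AsStrings[i] := Skus[i];
    Query.Params[1].AsFloats[i]  := Prices[i];
  end;
  Query.Execute(Query.Params.ArraySize);        // single round trip for all rows
end;
```

With CachedUpdates you would instead post the rows into the dataset as usual and call ApplyUpdates once per accumulated batch.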

Uwe Raabe
  • Thank you for the proposed solution. In fact I'm not yet a FireDAC guy; I've been coding against databases with ADO since its earliest release. However, this is a good reason for me to dig deeply into FireDAC. – Interface Unknown Mar 27 '14 at 15:21

I like @Uwe's suggestion. If you are rolling your own solution without FireDAC, however, you can use an in-memory dataset as a buffer and then blast the data to a stored procedure.
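A rough sketch of that buffer-and-flush idea, using a TClientDataSet as the in-memory buffer and ADO to call the stored procedure (`dbo.InsertOffers`, the field names and the semicolon-delimited format are all my assumptions; the procedure itself would have to split the text and insert the rows):

```delphi
uses
  System.Classes, Data.Win.ADODB, Datasnap.DBClient;

// Serializes the buffered rows and hands them to a (hypothetical)
// stored procedure in one round trip, then empties the buffer.
procedure FlushBuffer(Buffer: TClientDataSet; Cmd: TADOCommand);
var
  Rows: TStringList;
begin
  Rows := TStringList.Create;
  try
    Buffer.First;
    while not Buffer.Eof do
    begin
      // one line per row: "sku;price"
      Rows.Add(Buffer.FieldByName('sku').AsString + ';' +
               Buffer.FieldByName('price').AsString);
      Buffer.Next;
    end;
    Cmd.CommandText := 'EXEC dbo.InsertOffers :Batch';
    Cmd.Parameters.ParamByName('Batch').Value := Rows.Text;
    Cmd.Execute;                 // one call instead of one per row
    Buffer.EmptyDataSet;
  finally
    Rows.Free;
  end;
end;
```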

Of course, this would require changes outside of the code, and you would need appropriate permissions to create the stored procedure and so forth. But if this idea appeals to you, here are two links that may help with this technique:

James Jensen
  • James, thank you for the proposed solution. There are some really good techniques described there. However, I am now considering using a bulk insert from a client-side text file. In my case it would be the simplest approach, as well as the best in terms of performance. – Interface Unknown Mar 27 '14 at 15:24

There's also the good old-fashioned BULK INSERT from a flat file into the database. With a large data-transfer app I developed (years ago) this was by far the fastest solution. But that was before large insert statements, and it only pays off if you can wait to accumulate batches of at least 1000 rows.

Since you have only two very simple numeric fields, you won't have to worry about Unicode, delimiters, escape characters, etc. Just write your intermediate results to a simple ASCII file, then BULK INSERT it in one transaction.

You will have to make sure this works multi-threaded (which should not be too difficult with unique file names), and you will have to experiment with how much 'latency' you can tolerate, whether you can use table locks, etc. The larger the bulk-inserted file, the more you gain.

Also make sure you set the database's recovery model to bulk-logged (or simple) so that the bulk operation is minimally logged and doesn't bloat the transaction log.
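A rough sketch of a per-thread flush (the `dbo.Offers` table, the UNC path and the comma-separated layout are all made up; note that BULK INSERT runs on the server, so the file path must be reachable by the SQL Server service account, e.g. via a share):

```delphi
uses
  System.SysUtils, System.Classes, Data.Win.ADODB;

// Writes the accumulated "sku,price" lines to a uniquely named file
// and loads them with a single BULK INSERT, then cleans up.
procedure FlushToFileAndBulkInsert(Conn: TADOConnection; Batch: TStrings);
var
  Guid: TGUID;
  FileName: string;
begin
  CreateGUID(Guid);  // unique name per flush keeps threads from colliding
  FileName := '\\appserver\bulk\offers_' + GUIDToString(Guid) + '.txt';
  Batch.SaveToFile(FileName, TEncoding.ASCII);   // lines like: 12345,9.99
  Conn.Execute(
    'BULK INSERT dbo.Offers FROM ''' + FileName + ''' ' +
    'WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', TABLOCK)');
  DeleteFile(FileName);
end;
```

The TABLOCK hint relates to the 'table locks' mentioned above; it helps the load qualify for minimal logging, and is one of the things worth experimenting with.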

Jan Doggen