There's a time-critical application that handles messages from a trading server, where we get around 10K messages per second... At times the application seems to spend a lot of time doing the inserts into the database... After several days of going back and forth with the dev team about which side is taking the time, our DB team decided to build a simple C# app that sits on a server in the same rack and on the same network as the database server. The database in question is SQL Server 2012 Standard.
The times were taken from ADO.NET this way...
var currT1 = DateTime.Now;   // timestamp taken immediately before the call
sqlComm.ExecuteNonQuery();   // sqlComm is the SqlCommand doing the insert
var loadT = DateTime.Now;    // timestamp taken immediately after the call
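For what it's worth, the whole test is essentially just that call wrapped in a loop; a minimal, self-contained sketch is below (the connection string, table name and INSERT statement are placeholders, not the real ones, but the timing is taken exactly as above):

using System;
using System.Data.SqlClient;

class TimingTest
{
    static void Main()
    {
        // NOTE: connection string, table and values are placeholders for illustration
        using (var conn = new SqlConnection("Server=dbserver;Database=Trading;Integrated Security=true"))
        {
            conn.Open();
            using (var sqlComm = new SqlCommand("INSERT INTO dbo.TestTable (Payload) VALUES (@p)", conn))
            {
                sqlComm.Parameters.AddWithValue("@p", "test message");

                var currT1 = DateTime.Now;          // app-side start time
                sqlComm.ExecuteNonQuery();          // the insert being measured
                var loadT = DateTime.Now;           // app-side end time

                Console.WriteLine(loadT - currT1);  // elapsed time as seen by the app
            }
        }
    }
}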
The times from SQL Server were taken from the StartTime and EndTime columns of a server-side trace... The two servers are time-synced, but there's a difference of 10-15 ms between them...
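On the DB side we basically pull those columns back out of the trace file; something along these lines (a sketch; the trace file path, filter and connection string are placeholders, and Duration from a server-side trace is reported in microseconds, hence the division):

using System;
using System.Data.SqlClient;

class TraceReader
{
    static void Main()
    {
        // Placeholder trace file path; StartTime/EndTime/Duration are standard trace columns.
        const string sql = @"
            SELECT TextData, StartTime, EndTime, Duration
            FROM fn_trace_gettable('D:\Traces\insert_timing.trc', DEFAULT)
            WHERE TextData LIKE '%INSERT INTO dbo.TestTable%'";

        using (var conn = new SqlConnection("Server=dbserver;Database=master;Integrated Security=true"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var start = (DateTime)reader["StartTime"];
                    var end = (DateTime)reader["EndTime"];
                    var durationMs = (long)reader["Duration"] / 1000.0;  // microseconds -> ms
                    Console.WriteLine("{0:HH:mm:ss.fff}  {1:HH:mm:ss.fff}  {2} ms", start, end, durationMs);
                }
            }
        }
    }
}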
Now, what's making me want to bang my head on something: it's understandable that the application takes longer than the DB (since it has to do processing and other stuff as well)... But in some cases the DB reports it took 4 ms, while the app says it took zero ms!!
I definitely think the bug is in the test app... But there's nothing separating the DB call and the two timestamps... The log reads like this: app times (start, end, diff, method) followed by the DB call (StartTime, EndTime, diff)...
App: 10:46:06.716, 10:46:06.716, 0:00:00.000, DataAdapter
DB:  10:46:06.697, 10:46:06.700, 0:00:00.003
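The app-side log lines are just the two DateTime values, their difference and the method name written with millisecond formatting, roughly like this (simplified; the exact format strings are not important):

// currT1 / loadT are the timestamps taken around ExecuteNonQuery above
Console.WriteLine(currT1.ToString("HH:mm:ss.fff"));               // app start
Console.WriteLine(loadT.ToString("HH:mm:ss.fff"));                // app end
Console.WriteLine((loadT - currT1).ToString(@"h\:mm\:ss\.fff"));  // app diff
Console.WriteLine("DataAdapter");                                 // method used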
Is there anything else I should provide?