So that would require client-side code. But it's not entirely clear what you want to measure. From the smallest scope to the largest, the timings could be:
- time inside the server application - measured by the code you already have.
- Your code can take its start time either from "Now()" when it begins executing, or from the HTTP request objects. The first call to a site will show a big difference between those two start times; otherwise they should be almost identical.
- time on the server web site - I believe this is already logged by most web servers, such as IIS.
- time on the server machine - I believe this is what "mo" is referring to. You would need some kind of external monitoring on the server machine, à la Wireshark.
- time on the client machine - again, you would need some kind of external monitoring on the client machine. This would be the hardest to get, but I think it is really what you are asking for.
- time in the client application - this is what you can measure with JavaScript (see the sketch below).
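For that last item, here is a minimal sketch of client-side timing using the browser's standard Navigation Timing API, written in TypeScript. The timing properties are real, but the "/timing-log" endpoint it reports to is hypothetical - you would add something like it on your server:

```typescript
// Minimal sketch: read the browser's own timing for the page load and report it.
function reportPageTiming(): void {
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (!nav) {
    return; // very old browsers will not expose navigation entries
  }

  const timings = {
    // time from sending the request until the first response byte arrives
    timeToFirstByte: nav.responseStart - nav.requestStart,
    // time spent downloading the response body
    download: nav.responseEnd - nav.responseStart,
    // total time as the client perceived it (navigation start to load event end)
    total: nav.loadEventEnd - nav.startTime,
  };

  // "/timing-log" is a hypothetical endpoint you would add on the server.
  navigator.sendBeacon("/timing-log", JSON.stringify(timings));
}

// loadEventEnd is only populated after the load event completes, so defer a tick.
window.addEventListener("load", () => setTimeout(reportPageTiming, 0));
```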
Unless this is the "first call" (see Slow first page load on asp.net site or ASP.NET application on IIS7 - very slow startup after iisreset), I believe all of these times will be so close together that a "good enough" approach will serve you instead.
If you must have a measure of this particular call's client time, then you are stuck in a bad spot. But if you just want better numbers, continue measuring the first timing (application time) with what you already have, and make sure to also measure the size of the request and the response.
Then set a baseline for adjusting that time by testing against various target client machines:
- Measure ping times from the client to your server
- Measure transfer times of moderately large content - both upload and download (a rough sketch of these measurements follows this list)
- Finagle the numbers to get your average adjustment
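Here is one way the first two measurements could look from a browser client, again as a TypeScript sketch. The "/ping" and "/speed-test-payload" endpoints are hypothetical, and an upload measurement would work the same way by timing a POST of a body of known size:

```typescript
// Rough sketch: measure round-trip time and download throughput from a client.
async function measureBaseline(): Promise<{ pingMs: number; downloadBytesPerMs: number }> {
  // Ping: round-trip time of a tiny request ("/ping" is a hypothetical endpoint).
  const pingStart = performance.now();
  await fetch("/ping", { cache: "no-store" });
  const pingMs = performance.now() - pingStart;

  // Download: time a moderately large payload, then divide size by elapsed time.
  const downloadStart = performance.now();
  const response = await fetch("/speed-test-payload", { cache: "no-store" });
  const body = await response.arrayBuffer();
  const downloadMs = performance.now() - downloadStart;

  return { pingMs, downloadBytesPerMs: body.byteLength / downloadMs };
}
```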
You should end up with a formula like:
[AdjustedTime] = [PingTime] + [ServerTime]
               + ([RequestSize] / [UploadSpeed])
               + ([ResponseSize] / [DownloadSpeed]);
This would be the expected client response time.
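As a concrete (and purely illustrative) version of that formula in TypeScript, with the speeds expressed as throughput so the units work out to milliseconds - all of the names here are assumptions:

```typescript
// Illustrative only: units are milliseconds and bytes throughout.
interface Baseline {
  pingMs: number;             // measured round-trip time
  uploadBytesPerMs: number;   // measured upload throughput
  downloadBytesPerMs: number; // measured download throughput
}

function estimateClientTime(
  serverMs: number,      // the application time you already measure
  requestBytes: number,  // size of the incoming request
  responseBytes: number, // size of the outgoing response
  baseline: Baseline,
): number {
  return (
    baseline.pingMs +
    serverMs +
    requestBytes / baseline.uploadBytesPerMs +
    responseBytes / baseline.downloadBytesPerMs
  );
}
```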