We show how long a task has been in the opened state using a timer. For this we need the difference between the task-opened time and the current time. The problem is that the task-opened time is captured in server time, while the current time comes from the client. The client clock runs 25 seconds slower than the server's, so for the first 25 seconds the timer shows a negative value. To avoid this, instead of reading the current time from the client, I am planning to fetch the current time from the server once and then keep adding seconds to it locally. However, I am not sure how to get the server's current time accurately: I have to send a request to the server, and by the time the response arrives the value is already stale because of network latency (there could be a 1-2 second difference). How can I avoid this?
1 Answer
One way out is to use the Network Time Protocol (NTP) and fetch accurate times from the various public time servers available, then use them as needed. See this for more details.
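The latency concern from the question can be handled the same way NTP does: record the client clock when the request is sent and when the response arrives, and assume the server stamped its time at the midpoint of the round trip. A minimal sketch of that midpoint correction (the class and method names are illustrative, not a real API; the original system is WCF/C#, this is shown in Java for brevity):

```java
public class ServerClock {
    // Estimated (serverTime - clientTime), in milliseconds.
    private long offsetMillis;

    // One-time synchronization: sendMillis and receiveMillis are read from the
    // client clock around the request; serverMillis is the server's timestamp.
    // Assuming the server stamped at the midpoint of the round trip cancels
    // roughly half of the network latency instead of ignoring it entirely.
    public void synchronize(long sendMillis, long serverMillis, long receiveMillis) {
        long midpoint = (sendMillis + receiveMillis) / 2;
        offsetMillis = serverMillis - midpoint;
    }

    // "Server now" derived from the client clock plus the estimated offset;
    // after the single sync call, no further requests are needed.
    public long now(long clientNowMillis) {
        return clientNowMillis + offsetMillis;
    }

    public static void main(String[] args) {
        ServerClock clock = new ServerClock();
        // Client clock 25 s behind the server; request took 2 s round trip.
        long send = 1_000_000L;                // client clock at send
        long receive = 1_002_000L;             // client clock at receive
        long serverStamp = 1_001_000L + 25_000L; // server time at the midpoint
        clock.synchronize(send, serverStamp, receive);
        System.out.println(clock.now(receive)); // prints 1027000
    }
}
```

The residual error of this estimate is bounded by half the round-trip time, so a 1-2 second latency shrinks to well under a second, far better than the 25-second skew.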
The other logical way out is to have a property called `TimeTaken` in your WCF service's data contract (essentially somewhere in the result of your WCF operation). Note the start and end times of your WCF task (in WCF itself), calculate the difference between them, and set the `TimeTaken` property.

At the client end, you would then have the actual time taken by the task in the WCF service, and you could simply add it to the current time at the client to get the desired result.
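The key point of this approach is that both timestamps come from the same (server) clock, so client skew never enters the calculation; the client only adds its own *elapsed* time, never an absolute client timestamp. A sketch under those assumptions (`TaskResult` and the method names are hypothetical stand-ins for the WCF data contract, shown in Java rather than C#):

```java
import java.time.Duration;
import java.time.Instant;

// Hypothetical stand-in for the WCF result carrying the TimeTaken property.
class TaskResult {
    final long timeTakenMillis;
    TaskResult(long timeTakenMillis) { this.timeTakenMillis = timeTakenMillis; }
}

public class TaskTimer {
    // Server side: both instants are read from the server clock, so the
    // difference is immune to any client clock skew.
    static TaskResult buildResult(Instant taskOpened, Instant serverNow) {
        return new TaskResult(Duration.between(taskOpened, serverNow).toMillis());
    }

    // Client side: the displayed timer is the server-computed duration plus
    // the client's locally measured elapsed time since the response arrived.
    static long displayedMillis(TaskResult result, long clientElapsedMillis) {
        return result.timeTakenMillis + clientElapsedMillis;
    }

    public static void main(String[] args) {
        Instant opened = Instant.parse("2020-01-01T00:00:00Z");
        Instant serverNow = Instant.parse("2020-01-01T00:05:00Z");
        TaskResult r = buildResult(opened, serverNow); // 300 s open on the server
        // 10 s after receiving the response, the timer reads 310 s.
        System.out.println(displayedMillis(r, 10_000)); // prints 310000
    }
}
```

Measuring `clientElapsedMillis` with a monotonic source (e.g. `System.nanoTime()` deltas) rather than wall-clock subtraction keeps the running timer correct even if the client clock is adjusted while the task is open.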