I need to measure the one-way latency between two applications that communicate over a LAN, and report the result to a data collection server.
The client application sends the data via multicast; it then passes through two servers, and the last server is the end point of the test, like so:
Agent -> multicast cloud -> server 1 -> server 2
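The idea is for the agent to embed its send time in the payload so that the end point can compute the difference on arrival. A minimal sketch of the agent side (the group address 239.0.0.1 and port 5000 are just placeholders):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class Agent
{
    static void Main()
    {
        // Placeholder multicast group and port -- substitute your own.
        var endpoint = new IPEndPoint(IPAddress.Parse("239.0.0.1"), 5000);

        using (var client = new UdpClient())
        {
            // Stamp the payload with the send time in UTC ticks (100 ns units).
            // This timestamp is only meaningful on the receiving side if both
            // machines' clocks are synchronized (e.g. via NTP or PTP).
            byte[] payload = BitConverter.GetBytes(DateTime.UtcNow.Ticks);
            client.Send(payload, payload.Length, endpoint);
        }
    }
}
```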
I thought about using NTP (or PTP, since this is a LAN) to synchronize the "agent" and "server 2", but I'm not sure what the right algorithm to implement this would be.

How can I perform this measurement in C#, and what precision can I expect?
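For reference, this is the sort of thing I had in mind on server 2 (again just a sketch; it assumes the timestamp in the payload survives the hops through server 1 unchanged):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class Server2
{
    static void Main()
    {
        // Same placeholder group/port as the agent sketch above.
        var group = IPAddress.Parse("239.0.0.1");

        using (var client = new UdpClient(5000))
        {
            client.JoinMulticastGroup(group);
            var remote = new IPEndPoint(IPAddress.Any, 0);

            // Blocks until a datagram arrives.
            byte[] payload = client.Receive(ref remote);

            // Recover the agent's send timestamp and subtract it from the
            // local clock. The result is the one-way latency plus whatever
            // clock offset remains between the two machines, so accuracy is
            // bounded by how well NTP/PTP keeps the clocks aligned
            // (roughly milliseconds for NTP on a LAN, microseconds for
            // hardware-assisted PTP).
            long sentTicks = BitConverter.ToInt64(payload, 0);
            var oneWay = TimeSpan.FromTicks(DateTime.UtcNow.Ticks - sentTicks);
            Console.WriteLine("One-way latency: {0:F3} ms", oneWay.TotalMilliseconds);
        }
    }
}
```

One thing I'm unsure about is that `DateTime.UtcNow` itself has limited resolution (on the order of the system timer tick, up to ~15 ms on older Windows versions), which could dominate the error budget regardless of how well the clocks are synchronized.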
UPDATE: Note that the data is processed between the agent and server 2, so the measurement is not purely network latency.