I have 3 programs talking to each other. One of them has a strange issue that is throwing my latency measurements off.
All 3 applications use the same custom communications library...
In the problematic application, a new task is started at program startup to continuously watch a queue for incoming messages:
public void Init()
{
    var ip = "127.0.0.1";
    CommunicationsManager.UdpManager.Initialize(ip, (int)UDPPort.FromUnityToPredictor, (int)UDPPort.FromPredictorToUnity);
    CommunicationsManager.UdpManager.UdpMessageReceived += OnMessageReceived;
    CommunicationsManager.UdpManager.StartListening();

    var task = Task.Run(() =>
    {
        ProcessMessageQueueThread();
    });

    // Thread thread = new Thread(ProcessMessageQueueThread);
    // thread.IsBackground = true;
    // thread.Start();
}
private void ProcessMessageQueueThread()
{
    while (true)
    {
        if (MessageQueue.Count > 0)
        {
            ProcessMessage();
        }
    }
}
I have a function subscribed to an event, which fires upon the arrival of a new UDP datagram:
private void OnMessageReceived(object sender, UDPMessage message)
{
    MessageQueue.Enqueue(message);

    //Task.Run(() =>
    //{
    //    ProcessMessage();
    //});
}
Upon the function firing, the message is added to a BlockingCollection queue.
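For context, MessageQueue is essentially a thin wrapper around a BlockingCollection<UDPMessage>, roughly along these lines (a simplified sketch, not the exact class from my library):

using System.Collections.Concurrent;

// Simplified sketch: Enqueue/Dequeue map onto BlockingCollection's Add/Take.
public static class MessageQueue
{
    private static readonly BlockingCollection<UDPMessage> queue =
        new BlockingCollection<UDPMessage>();

    public static int Count => queue.Count;

    public static void Enqueue(UDPMessage message) => queue.Add(message);

    // Take() blocks until an item is available.
    public static UDPMessage Dequeue() => queue.Take();
}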
The message is then processed by ProcessMessage:
static Stopwatch sw = new Stopwatch();

private void ProcessMessage()
{
    var message = MessageQueue.Dequeue();
    messagesReceived++;
    sw.Restart();

    // If server-side prediction, delay before reading...
    if (!message.ClientSidePrediction)
    {
        // Thread.Sleep(message.UpDelay);
        Task.Delay(message.UpDelay).Wait();
    }

    // If prediction must not be used...
    if (!message.UsePrediction)
    {
        message.IsAPredictedMessage = false;
        CommunicationsManager.UdpManager.Send(message);
        messagesSent++;
        return;
    }

    if (message.UsePrediction && messagesReceived == 1)
    {
        message.IsAPredictedMessage = false;
        CommunicationsManager.UdpManager.Send(message);
        logger.Info("First message sent:" + sw.ElapsedMilliseconds);
        sw.Restart();
        messagesSent++;
    }

    message.IsAPredictedMessage = true;
    CommunicationsManager.UdpManager.Send(message);
    logger.Info("second message sent:" + sw.ElapsedMilliseconds);
    messagesSent++;

    // Console.WriteLine("Sent:" + messagesSent + ", Received: " + messagesReceived);
}
As can be seen, I am simulating a network delay in `ProcessMessage`:
Task.Delay(message.UpDelay).Wait();
If I set message.UpDelay to 30, I will see a delay between sending messages of 45ms or 46ms... if I set message.UpDelay to 1, I will see a sending delay of 15ms or 16ms...
However, most of the time the delay is correct, i.e. 30ms/31ms or 1ms/2ms. It seems that at some random time (at least, it appears random to me), the delay takes an additional 15ms/16ms to complete...
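To show what I mean by the measurement, this is the kind of standalone loop I would use to look at the raw Task.Delay timing on its own, outside of the UDP pipeline (an illustrative sketch, not code from my application):

using System;
using System.Diagnostics;
using System.Threading.Tasks;

class DelayCheck
{
    static void Main()
    {
        var sw = new Stopwatch();
        for (int i = 0; i < 50; i++)
        {
            sw.Restart();
            // Same pattern as in ProcessMessage, with UpDelay fixed at 30.
            Task.Delay(30).Wait();
            Console.WriteLine(sw.ElapsedMilliseconds);
        }
    }
}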
What is going on? It is very important that my delays are consistent and accurate. I have tried using Thread.Sleep and creating threads rather than using tasks (as can be seen in the commented-out code), but I do not see a difference.
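For completeness, the swap I tried looks roughly like this side-by-side check of Thread.Sleep against Task.Delay(...).Wait() (again an illustrative sketch, not my actual code); in my case both variants show the same occasional extra 15ms/16ms:

using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;

class SleepVsDelay
{
    static void Main()
    {
        var sw = new Stopwatch();
        for (int i = 0; i < 20; i++)
        {
            sw.Restart();
            Thread.Sleep(30);
            long sleepMs = sw.ElapsedMilliseconds;

            sw.Restart();
            Task.Delay(30).Wait();
            long delayMs = sw.ElapsedMilliseconds;

            Console.WriteLine("Thread.Sleep: " + sleepMs + " ms, Task.Delay: " + delayMs + " ms");
        }
    }
}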