I have a client-server architecture in which the client sends messages to the server every 70ms. On the server side, I want to introduce simulated network latency. To do this, I have the following:
//Receive incoming messages and put into temp queue.
private void DataReceived(object sender, DataReceivedEventArgs args)
{
    TemporaryRawMessages.Enqueue(args.Bytes);
    args.Recycle();
}
//Runs on a separate thread
void ProcessMessages()
{
    while (true)
    {
        if (TemporaryRawMessages.Count > 0)
        {
            var raw = TemporaryRawMessages.Dequeue();
            Task taskA = Task.Factory.StartNew(() => DelayedReceive(raw));
        }
    }
}
//Process the received message
void DelayedReceive(byte[] raw_message)
{
    //wait a bit before deserialising and queueing the message
    while (sw.ElapsedMilliseconds < LatencyUp) { }
    var message = (ClientToServerMessage)Utils.Deserialize(raw_message);
    Messages_In.Enqueue(message);
}
The server receives a message and puts it into a temporary queue. ProcessMessages() then picks up any new messages and spawns a new Task for each one. The task waits a bit before putting the message into another queue, which is picked up by the main program.
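For context, the shared state these snippets rely on looks roughly like this (declarations simplified; the exact types may differ):

//Simplified view of the fields used above
Queue<byte[]> TemporaryRawMessages = new Queue<byte[]>();
Queue<ClientToServerMessage> Messages_In = new Queue<ClientToServerMessage>();
Stopwatch sw = new Stopwatch(); //started elsewhere (not shown)
long LatencyUp = 100;           //simulated upstream latency in ms (e.g. 100)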
The problem is that if I time DelayedReceive, I see timings such as 3, 241, 46, 99, ... There is no consistency. Each message should be delayed by the simulated upstream latency while keeping its 70ms spacing. In other words, all messages are delayed by the same X (e.g. 100ms) but remain equally spaced 70ms apart.
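In case it helps, this is roughly how DelayedReceive is being timed (simplified; the exact measurement code is not important):

//Roughly how DelayedReceive is timed (simplified)
void DelayedReceive(byte[] raw_message)
{
    var timer = Stopwatch.StartNew();
    while (sw.ElapsedMilliseconds < LatencyUp) { }
    var message = (ClientToServerMessage)Utils.Deserialize(raw_message);
    Messages_In.Enqueue(message);
    timer.Stop();
    Console.WriteLine(timer.ElapsedMilliseconds); //prints values like 3, 241, 46, 99, ...
}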
Any ideas on what the issue is?