My DatagramSocket is not throwing a SocketTimeoutException even though a timeout is set, and I'm not sure how to resolve this. The code is below. If the socket never receives any messages, it times out on the first pass through the loop as expected. However, once it has successfully received a message a couple of times, it no longer times out on later calls to .receive().
DatagramSocket serverSocket = new DatagramSocket(serverSyncPort);
serverSocket.setSoTimeout(200); // receive() should throw SocketTimeoutException after 200 ms

while (true)
{
    byte[] receiveData = new byte[1024];
    DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
    try
    {
        serverSocket.receive(receivePacket);
    }
    catch (SocketTimeoutException e)
    {
        // expected whenever nothing arrives within 200 ms, but this stops
        // happening after a few packets have been received
    }
}
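
For comparison, here is a minimal standalone version of what I expect to happen (the port 9876 is just a placeholder; in my real code it is serverSyncPort). My understanding is that every receive() call that blocks longer than the timeout should throw SocketTimeoutException, no matter how many packets were received before:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.SocketTimeoutException;

public class TimeoutTest
{
    public static void main(String[] args) throws Exception
    {
        // placeholder port standing in for serverSyncPort
        DatagramSocket serverSocket = new DatagramSocket(9876);
        serverSocket.setSoTimeout(200);

        while (true)
        {
            byte[] receiveData = new byte[1024];
            DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            try
            {
                serverSocket.receive(receivePacket);
                System.out.println("received " + receivePacket.getLength() + " bytes");
            }
            catch (SocketTimeoutException e)
            {
                // I expect to land here every time no packet arrives within 200 ms,
                // even after earlier packets were received successfully
                System.out.println("timed out");
            }
        }
    }
}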