I have written a pretty basic application in C# (.NET Compact Framework 2.0) using UDP sockets.
The program works fine for a while (up to a couple of weeks at a time), but it always fails eventually. On top of my clients not being able to reconnect, this bug seems to kill all activity on the associated NIC. Once this happens, I am no longer able to remote into the device (using CE Remote Display), which is my only means of getting additional feedback for debugging. So at this point I am not 100% certain whether the application itself crashes, or whether I am breaking something in the operating system via my socket code.
I have hooked the unhandled exception event, but it never gets raised. I also have a number of try/catch blocks that write any exception message to a text file, and I am not seeing any exceptions being thrown.
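For context, the exception logging is roughly along these lines (a simplified sketch; the log file path and the OnUnhandledException/LogException names are placeholders, not my exact code):

using System;
using System.IO;

static class Program
{
    static void Main()
    {
        // Catch anything that escapes the local try/catch blocks.
        AppDomain.CurrentDomain.UnhandledException +=
            new UnhandledExceptionEventHandler(OnUnhandledException);

        // ... application startup ...
    }

    static void OnUnhandledException(object sender, UnhandledExceptionEventArgs e)
    {
        LogException(e.ExceptionObject as Exception);
    }

    static void LogException(Exception ex)
    {
        // Append the message and stack trace to a text file on the device.
        using (StreamWriter sw = new StreamWriter(@"\exceptions.log", true))
        {
            sw.WriteLine(DateTime.Now + " - " + (ex == null ? "(unknown)" : ex.ToString()));
        }
    }
}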
(Old TCP code removed.)
The clients themselves are simple little gateway devices that are configured as UDP servers. This is a remote system that I can only access sparingly, and although I have a test controller and gateway unit, the conditions are not identical and I have not yet been able to reproduce the issue on my end.
TIA for any feedback.
Edit:
I've been running my test bench demo and periodically checking netstat on the server, per some comment suggestions. In CE5, netstat does not take the -a flag, so I've been using -n (not sure if this is going to tell me what I need...). I have been disconnecting and reconnecting my clients several times, forcing half-opens by unplugging Ethernet, etc., and the netstat table only ever shows one connection per client (at the appropriate ports).
Edit 2:
Due to the sparse nature of the messaging in production, I changed the application over to connectionless UDP messaging, but I am still experiencing the same behavior (with about the same amount of time to failure). On my test hardware, the application runs indefinitely without failure at a high message rate (one every few seconds). However, in production, where messages are much less frequent, the program fails after running for about 10 days. I wouldn't think inactivity would matter, but perhaps I've got that wrong? Looking for any suggestions I can get.
New Send/Receive code:
public void Send(string Message)
{
    Socket udpClient = new Socket(AddressFamily.InterNetwork, SocketType.Dgram, ProtocolType.Udp);
    EndPoint ep = new IPEndPoint(IPAddress.Parse(_ipAddress), _port);
    udpClient.Connect(ep);

    byte[] data = Encoding.ASCII.GetBytes(Message);

    // One short-lived socket per message: async send, sync receive, then close.
    udpClient.BeginSendTo(data, 0, data.Length, SocketFlags.None, ep, (ar) =>
    {
        try
        {
            udpClient.EndSendTo(ar);
            _lastSent = Message;

            string msg = this.ReceiveSync(udpClient, 3);
            if (!string.IsNullOrEmpty(msg))
            {
                _lastReceived = msg;
                DataReceived(new ReceiveDataEvent(_lastReceived));
            }
        }
        catch { }
        finally
        {
            udpClient.Close();
        }
    }, null);
}
private string ReceiveSync(Socket UdpClient, int TimeoutSec)
{
    string msg = "";
    byte[] recBuffer = new byte[256];
    int elapsed = 0;
    bool terminate = false;

    do
    {
        // Poll for available data every 500 ms until TimeoutSec seconds have elapsed.
        if (UdpClient.Available > 0)
        {
            int bytesRead = UdpClient.Receive(recBuffer, 0, recBuffer.Length, SocketFlags.None);
            msg = Encoding.ASCII.GetString(recBuffer, 0, bytesRead);
            terminate = true;
        }
        else if ((elapsed / 2) == TimeoutSec)
        {
            terminate = true;
        }
        else
        {
            elapsed++;
            System.Threading.Thread.Sleep(500);
        }
    } while (!terminate);

    return msg;
}