
Running on Windows 10 with .NET 4.7.2 and virtual serial ports, I have a SerialPort DataReceived handler that looks like:

private void _serialPort_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    _serialPort.ReadTimeout = 10000;
    while (true)
    {
        try
        {
            var nextByte = _serialPort.ReadByte();
            Debug.WriteLine(((byte)nextByte).ToString("X2"));
            continue;
        }
        catch (TimeoutException)
        {
        }

        break;
    }
}

If I write a byte to the port it appears immediately in the debug window, and because of the while loop and continue, ReadByte() is called again and blocks until the 10-second timeout elapses. If I write more data to the port during that time, ReadByte() does not return immediately. Instead, it continues to block until the full timeout has elapsed, and only then returns the newly written byte.

Why would it not return as soon as new data is available?

Dan
  • Make sure you do not use hardware or software handshaking; they are archaic and could explain the results. The cable you are using could also explain the issue. You have to read the manual for the device to see exactly what is required. The device may not send anything until a full message is sent. The device may not echo characters, and that is why you are not seeing anything. – jdweng Feb 04 '19 at 03:24
  • Why use a `while(true)` loop in the `SerialPort.DataReceived` event handler instead of a single read of `SerialPort.BytesToRead` bytes? See the [Top 5 SerialPort Tips](https://blogs.msdn.microsoft.com/bclteam/2006/10/10/top-5-serialport-tips-kim-hamilton/) article by Kim Hamilton – Leonid Vasilev Feb 04 '19 at 08:36
  • That would seem to be the natural approach. However, experience has shown me that technique doesn't work - it causes bytes to be missed because BytesToRead is unreliable. – Dan Feb 04 '19 at 16:23
  • The answer is that what I'm asking it to do is beyond the capabilities of the .NET SerialPort class. It's simply this: if you call Read* before the bytes arrive and the DataReceived event is raised, the entire timeout duration must elapse before any data will be returned to you. – Dan Mar 05 '19 at 07:39
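For reference, the non-blocking pattern suggested in the comments reads only the bytes the driver has already buffered and returns from the handler immediately, instead of looping on `ReadByte()`. A minimal sketch of that idea, assuming the port is already open; the `ByteAccumulator` helper is hypothetical, and message framing would depend on the device's protocol:

```csharp
using System;
using System.Collections.Generic;
using System.IO.Ports;

// Hypothetical helper: buffers chunks handed over by DataReceived so a
// protocol layer can consume bytes later without touching the port.
class ByteAccumulator
{
    private readonly Queue<byte> _bytes = new Queue<byte>();

    public void Append(byte[] chunk, int count)
    {
        for (int i = 0; i < count; i++)
            _bytes.Enqueue(chunk[i]);
    }

    public int Available { get { return _bytes.Count; } }

    public byte Next() { return _bytes.Dequeue(); }
}

class Receiver
{
    private readonly SerialPort _port;
    private readonly ByteAccumulator _accumulator = new ByteAccumulator();

    public Receiver(SerialPort port)
    {
        _port = port;
        _port.DataReceived += OnDataReceived;
    }

    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // Read only what is already buffered; do not loop on ReadByte(),
        // so the handler never blocks waiting for a timeout.
        int count = _port.BytesToRead;
        if (count <= 0) return;
        var chunk = new byte[count];
        int read = _port.Read(chunk, 0, count);
        _accumulator.Append(chunk, read);
    }
}
```

Whether `BytesToRead` is trustworthy in practice is exactly what is disputed in the comment above, so this is only the textbook alternative, not a claim that it resolves the behavior described in the question.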

0 Answers