
I have developed an application in Visual C++ 2008 that reads data periodically (every 50 ms) from a COM port. To read the data periodically, I placed the read call in an OnTimer() function, and because I didn't want the rest of the GUI to hang, I start this timer from within a worker thread. I have placed the code below.

The application runs fine, but it shows the following unexpected behaviour: after the data source (a hardware device or even a data emulator) stops sending data, my application continues to receive data for a period of time proportional to how long the read function has been running (EDIT: this excess period is in the same ballpark as the period of time the data was sent for). So if I start and stop the data flow immediately, this is reflected on my GUI, but if I start the data flow and stop it ten seconds later, my GUI continues to show data for 10 more seconds (EDITED).

I have made the following observations after exhausting all my attempts at debugging:

  1. As mentioned above, this excess period of operation is proportional to how long the hardware has been sending data.
  2. The frequency of incoming data is 50ms, so to receive 10 extra seconds' worth of data, my GUI must be receiving around 200 extra data packets.
  3. The only buffer I have declared is abBuffer, which is just a byte array of fixed size. I don't think this can increase in size, so the data must be being stored somewhere else.
  4. If I change something in the data packet, this change, understandably, shows up on the GUI after a delay (because of the above points). But this implies that the data received at the COM port is stored in some variable-sized buffer from which my read function reads.
  5. I have timed the read and the processing. The processing is instantaneous, while the read very rarely (3 times in 1000 reads, with no discernible pattern) takes 16ms. This is well within the 50ms window the GUI has for each read.

The following is my thread and timer code:

// Worker thread entry point: receives a pointer to the dialog and starts its 50 ms timer
UINT CMyCOMDlg::StartThread(LPVOID param)
{
    THREADSTRUCT *ts = (THREADSTRUCT*)param;

    ts->_this->SetTimer(1, 50, 0);

    return 0;
}

//Timer function that is called at regular intervals
void CMyCOMDlg::OnTimer(UINT_PTR nIDEvent)
{       

    if(m_bCount==true)
    {
        DWORD NoBytesRead;
        BYTE  abBuffer[45];

        if(ReadFile(m_hComm, abBuffer, 45, &NoBytesRead, NULL))
        {
            if(NoBytesRead==45)
            {
                if((abBuffer[0]==0x10 && abBuffer[1]==0x10) || (abBuffer[0]==0x80 && abBuffer[1]==0x80))
                {
                    fnSetData(abBuffer);
                }
                else
                {
                    CString value;                      
                    value.Append("Header match failed");
                    SetDlgItemText(IDC_RXRAW,value);    
                }
            }
            else
            {
                CString value;
                value.Append(LPCTSTR(abBuffer),NoBytesRead);
                value.Append("\r\nInvalid Packet Size");
                SetDlgItemText(IDC_RXRAW,value);
            }
        }
        else
        {
            DWORD dwError2 = GetLastError();
            CString error2;
            error2.Format(_T("%d"),dwError2);
            SetDlgItemText(IDC_RXRAW,error2);
        }

        fnClear();
    }
    else
    {
        KillTimer(1);
    }

    CDialog::OnTimer(nIDEvent);

}

m_bCount is just a flag I use to kill the timer, and ReadFile is the standard Windows API call. ts is a structure that contains a pointer to the main dialog class, i.e., this.
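
For context, the thread setup looks roughly like this (a simplified sketch; only the relevant field of THREADSTRUCT is shown, and the fnStartReading() wrapper is illustrative rather than my exact code):

struct THREADSTRUCT
{
    CMyCOMDlg *_this;   // pointer back to the dialog so the thread can start its timer
};

// Illustrative: called from the dialog to begin reading
void CMyCOMDlg::fnStartReading()
{
    THREADSTRUCT *_param = new THREADSTRUCT;   // not freed in this sketch
    _param->_this = this;

    m_bCount = true;                           // lets OnTimer() keep reading
    AfxBeginThread(StartThread, _param);       // the thread calls SetTimer(1, 50, 0) and returns
}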

Can anyone think of a reason this could be happening? I have tried a lot of things, and my code does so little that I cannot figure out where this unexpected behaviour is coming from.

EDIT:

I am adding the COM port settings and timeouts used below:

            dcb.BaudRate = CBR_115200;
            dcb.ByteSize = 8;
            dcb.StopBits = ONESTOPBIT;
            dcb.Parity = NOPARITY;
            SetCommState(m_hComm, &dcb);

            _param->_this = this;   // pass the dialog pointer to the worker thread

            COMMTIMEOUTS timeouts;
            timeouts.ReadIntervalTimeout = 1;         // max ms allowed between two received bytes
            timeouts.ReadTotalTimeoutMultiplier = 0;  // no per-byte contribution to the total read timeout
            timeouts.ReadTotalTimeoutConstant = 10;   // total read timeout of 10 ms
            timeouts.WriteTotalTimeoutMultiplier = 1;
            timeouts.WriteTotalTimeoutConstant = 1;
            SetCommTimeouts(m_hComm, &timeouts);
  • Data from the COM port is stored in the system buffer by the OS. Your application program does not read data directly from the hardware, but rather just fetches data from the intermediate system buffers. *"The frequency of incoming data is 50ms"* -- Frequency is a rate (e.g. X per second), but you're only mentioning a time interval. – sawdust Jul 25 '17 at 08:18
  • @sawdust I apologise for not clarifying the frequency. It is 20 packets a second (50ms between consecutive data packets). I assumed the `ReadFile` API call would read from the port directly. Even if this is not the case, why would a delay be caused if the read starts as soon as the data is sent? Could it be something to do with the timer function? – EagerLearner Jul 25 '17 at 08:20
  • @sawdust Additionally, regarding the system buffer: if the hardware was sending data before my application started reading, will all the data sent before the app officially started reading also be read? – EagerLearner Jul 25 '17 at 08:26
  • *"if the hardware was sending data before ..."* -- The OS typically allocates this system buffer when the device is opened, so whatever is received prior is not "read". More likely it's the data rate. Using a timer to read is redundant when you can simply use a blocking read to wait for the read to complete. That assumes that the data packets are actually sent every 50ms. But since you see an additional run of data, that implies that your imposed read rate of 20 times per second is slower than the actual rate that the data was sent. Hence the program is slowly creating a back flow. – sawdust Jul 25 '17 at 08:42
  • @sawdust Thank you for the reply. I checked the average interval between the system's calls to the read function; it turns out to be around 60ms. So Windows just makes sure that the interval is greater than the time I specify. Is there any way to improve this implementation? For example, how would I use `WaitCommEvent` to read periodically, if the read call finishes after the first read? – EagerLearner Jul 25 '17 at 09:08
  • I don't do Windows programming, so I'm guessing here. See the first comment to https://stackoverflow.com/questions/25996171/linux-blocking-vs-non-blocking-serial-read/26006680#26006680. I presume that the timer *plus* the timed read is slowing the effective read rate, so that the program is building a backflow. Either get rid of the timer or change the read *"to return immediately with the bytes that have already been received"*. – sawdust Jul 25 '17 at 09:35

1 Answer


You are processing only one message per call to the OnTimer() function. Since the timer interval is 1 second but the data source keeps sending a message every 50 milliseconds, your application cannot process all of the messages in a timely manner.

You can add a while loop as follows:

while(true)
{
    if(::ReadFile(m_hComm, &abBuffer, sizeof(abBuffer), &NoBytesRead, 0))
    {
        if(NoBytesRead == sizeof(abBuffer))
        {
            // a complete 45-byte packet was read -- process it, then loop for the next one
            ...
        }
        else
        {
            // short or empty read -- no complete packet left in the driver's buffer
            ...
            break;
        }
    }
    else
    {
        // ReadFile failed -- report the error
        ...
        break;
    }
}

But there is another problem in your code. If your software reads while the data source is still in the middle of sending a message, NoBytesRead could be less than 45. You may want to accumulate the incoming data in a message buffer such as a CString or a std::queue<unsigned char>, as sketched below.
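
For example, a minimal sketch of that idea (PollComPort() and m_rxBuffer are illustrative names, not from your code; the 45-byte packet size and fnSetData() are taken from the question):

#include <vector>

static const DWORD PACKET_SIZE = 45;

// std::vector<unsigned char> m_rxBuffer;   // added as a member of CMyCOMDlg in this sketch

void CMyCOMDlg::PollComPort()
{
    BYTE  chunk[256];
    DWORD NoBytesRead = 0;

    // Append whatever has arrived since the last poll, even if it is only part of a packet
    if(::ReadFile(m_hComm, chunk, sizeof(chunk), &NoBytesRead, NULL) && NoBytesRead > 0)
    {
        m_rxBuffer.insert(m_rxBuffer.end(), chunk, chunk + NoBytesRead);
    }

    // Hand out every complete 45-byte packet collected so far (header check omitted for brevity)
    while(m_rxBuffer.size() >= PACKET_SIZE)
    {
        fnSetData(&m_rxBuffer[0]);   // assumes fnSetData() takes a pointer to 45 bytes, as in the question
        m_rxBuffer.erase(m_rxBuffer.begin(), m_rxBuffer.begin() + PACKET_SIZE);
    }
}

This way the packet boundary comes from counting bytes in your own buffer, so a ReadFile() call that returns only part of a packet is no longer a problem.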

If the message doesn't end with a NUL terminator, passing it to a CString object is not safe.

Also, if the first byte is 0x80 or above, CString may treat it as part of a multi-byte string, which can cause errors. If the message is not a literal text string, consider using another data format such as std::vector<unsigned char>.
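
If you only need to display the raw bytes, one option is to format them as hex first, for example (FormatPacketAsHex() is just an illustrative helper, not an existing API):

// Illustrative helper: render raw bytes as hex so NUL bytes and bytes >= 0x80 are safe to display
CString FormatPacketAsHex(const BYTE* data, DWORD length)
{
    CString text;
    for(DWORD i = 0; i < length; ++i)
    {
        CString byteText;
        byteText.Format(_T("%02X "), data[i]);
        text += byteText;
    }
    return text;
}

// Usage inside the short-read branch of OnTimer():
// SetDlgItemText(IDC_RXRAW, FormatPacketAsHex(abBuffer, NoBytesRead));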

By the way, you don't need to call SetTimer() from a separate thread. Kicking off a timer takes no time at all. Also, I recommend calling KillTimer() somewhere outside of the OnTimer() function so that the code is more intuitive.

If the data source keeps sending data continuously, you may also need to call PurgeComm() when you open/close the COM port.
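
For example (assuming m_hComm is already open; this flag combination simply discards anything already queued):

// Throw away stale data sitting in the driver's receive and transmit buffers
::PurgeComm(m_hComm, PURGE_RXCLEAR | PURGE_TXCLEAR);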

JazzSoft
  • Thank you for the reply! I apologise for the mistake, but the timer period was set to 1000ms only for debugging purposes. During a normal run it is 50ms; I have made that change in the code above. I realised the risk of storing the data in a `CString`, and that's why I used the byte array `abBuffer`, and because I explicitly asked the `ReadFile` function to read up to 45 bytes, I don't think the check will happen between data packets. The reason I called the timer function from a thread was that I needed the GUI to remain responsive during the read, and (...contd) – EagerLearner Jul 26 '17 at 03:29
  • (..contd) pushing all the reading and processing tasks to a worker thread frees up the rest of the GUI. This is also why I called the `KillTimer` function from within the timer, although I could have called it from my thread function. The issue, I feel, is the inaccuracy of the `WM_TIMER` message used by `SetTimer`. It only ensures that the lower limit of the period equals the time set; in actuality the period could be anywhere between 50ms and 70ms. I am trying to implement an event-based read. If you have any suggestions in that respect I would greatly appreciate them!! – EagerLearner Jul 26 '17 at 03:34
  • SetTimer() and KillTimer() return immediately. They don't wait at all. – JazzSoft Jul 27 '17 at 06:35