I am creating an automated test fixture in Visual Studio that communicates with a hardware device over serial. I want to communicate with the device using the write, sleep, read approach. I'm well aware that the overwhelming consensus says to use the DataReceived event handler. I've tried it, and from what I've read (and observed) it seems to be very unreliable. Specifically, each individual byte intermittently generates its own separate DataReceived event, rather than one event firing for the entire 26-byte sequence, as someone else described in this thread: https://www.codeproject.com/Questions/1178661/How-do-I-receive-a-byte-array-over-the-serial-port
This article argues that the DataReceived event handler is inconsistent altogether:
https://www.sparxeng.com/blog/software/must-use-net-system-io-ports-serialport
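For context, that per-byte behavior means a DataReceived handler can't assume the whole packet is present when the event fires; it has to accumulate bytes across events. Something like this sketch (untested, assuming the same serialPort2 and 26-byte packet as in my code below, wired up with serialPort2.DataReceived += serialPort2_DataReceived):

private readonly List<byte> rxBuffer = new List<byte>();  // Accumulates bytes across events

private void serialPort2_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    // Each event may deliver anywhere from 1 byte to the whole packet
    byte[] chunk = new byte[serialPort2.BytesToRead];
    serialPort2.Read(chunk, 0, chunk.Length);
    rxBuffer.AddRange(chunk);

    if (rxBuffer.Count >= 26)  // The entire 26-byte response has finally arrived
    {
        byte[] packet = rxBuffer.GetRange(0, 26).ToArray();
        rxBuffer.RemoveRange(0, 26);
        // Process packet here -- note this runs on a thread-pool thread, not the UI thread
    }
}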
I have a scope set up to measure the time delay between the start of the outgoing packet and the start of the incoming packet, and I'm very confident this approach will work. In fact, the approach does work, barring the inconsistent time delay generated by Thread.Sleep(). When Thread.Sleep() produces a delay within a few ms of the nominal value requested, the incoming data stream arrives perfectly; otherwise it does not. I was able to verify with my scope that when my incoming data stream came in wrong, it was specifically because Thread.Sleep() had produced a delay greater than what I specified.
Edit: The above statement is not true. I was simply reading too soon; increasing the delay resolved my issue. I've added more info at the end of the post.
This is the code responsible for setting up the outgoing 26-byte packet, then reading a response:
private void enableRemoteMode()
{
    if (serialPort2.IsOpen)
    {
        byte[] byteRemoteEnable = new byte[] { 0xAA, 0x00, 0x20, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
            0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xCB };
        serialPort2.Write(byteRemoteEnable, 0, byteRemoteEnable.Length);  // Send the 26-byte command packet
        Thread.Sleep(55);  // Wait for the device's response to arrive
        byte[] buffer = new byte[26];  // Response packet is also 26 bytes
        int bytesRead = 0;
        while (bytesRead < buffer.Length)  // Read() may return fewer bytes than requested, so loop until full
            bytesRead += serialPort2.Read(buffer, bytesRead, buffer.Length - bytesRead);
        dataIn2 = BitConverter.ToString(buffer);
        // Display the incoming response packet in the debug console
        Debug.WriteLine("");
        Debug.WriteLine("");
        Debug.WriteLine("Debug: Enable Remote Mode Command Response Packet");
        Debug.WriteLine(dataIn2);
    }
}
As I'm sure many of you are aware, the amount of delay generated by Thread.Sleep() is wildly inconsistent. There are numerous threads explaining how the default resolution of the standard Windows timer is roughly 15.6 ms due to operating system limitations, scheduler overhead, etc.
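A quick way to see this for yourself is to time a short sleep with Stopwatch; on a default-configured Windows system, a 1 ms request will typically come back around 15 ms:

// Measure the actual duration of a short Thread.Sleep()
var sw = System.Diagnostics.Stopwatch.StartNew();
Thread.Sleep(1);
sw.Stop();
// Typically prints ~15 ms rather than 1 ms at the default timer resolution
Debug.WriteLine("Requested 1 ms, actually slept " + sw.Elapsed.TotalMilliseconds + " ms");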
I found a thread called "Most accurate timer in .NET?" where the user in the top answer created a class wrapper for the Multimedia Timer API.
I copied their code and created a class called AccurateTimer.cs, but I'm unsure how to use it.
// AccurateTimer.cs
using System;
using System.Windows.Forms;
using System.Runtime.InteropServices;

namespace YourProjectsNamespace
{
    class AccurateTimer
    {
        private delegate void TimerEventDel(int id, int msg, IntPtr user, int dw1, int dw2);
        private const int TIME_PERIODIC = 1;
        private const int EVENT_TYPE = TIME_PERIODIC;  // + 0x100;  // TIME_KILL_SYNCHRONOUS causes a hang ?!

        [DllImport("winmm.dll")]
        private static extern int timeBeginPeriod(int msec);
        [DllImport("winmm.dll")]
        private static extern int timeEndPeriod(int msec);
        [DllImport("winmm.dll")]
        private static extern int timeSetEvent(int delay, int resolution, TimerEventDel handler, IntPtr user, int eventType);
        [DllImport("winmm.dll")]
        private static extern int timeKillEvent(int id);

        Action mAction;
        Form mForm;
        private int mTimerId;
        private TimerEventDel mHandler;  // NOTE: declared at class scope so the garbage collector doesn't release it!!!

        public AccurateTimer(Form form, Action action, int delay)
        {
            mAction = action;
            mForm = form;
            timeBeginPeriod(1);
            mHandler = new TimerEventDel(TimerCallback);
            mTimerId = timeSetEvent(delay, 0, mHandler, IntPtr.Zero, EVENT_TYPE);
        }

        public void Stop()
        {
            int err = timeKillEvent(mTimerId);
            timeEndPeriod(1);
            System.Threading.Thread.Sleep(100);  // Ensure callbacks are drained
        }

        private void TimerCallback(int id, int msg, IntPtr user, int dw1, int dw2)
        {
            if (mTimerId != 0)
                mForm.BeginInvoke(mAction);
        }
    }
}
They do provide an example, and I was able to run it successfully, but it's not quite what I need: their example is set up for periodic time delays, whereas I need single-shot delays.
// FormMain.cs
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace YourProjectsNamespace
{
    public partial class FormMain : Form
    {
        AccurateTimer mTimer1, mTimer2;

        public FormMain()
        {
            InitializeComponent();
        }

        private void FormMain_Load(object sender, EventArgs e)
        {
            int delay = 10;  // In milliseconds. 10 = 1/100th second.
            mTimer1 = new AccurateTimer(this, new Action(TimerTick1), delay);
            delay = 100;     // 100 = 1/10th second.
            mTimer2 = new AccurateTimer(this, new Action(TimerTick2), delay);
        }

        private void FormMain_FormClosing(object sender, FormClosingEventArgs e)
        {
            mTimer1.Stop();
            mTimer2.Stop();
        }

        private void TimerTick1()
        {
            // Put your first timer code here!
        }

        private void TimerTick2()
        {
            // Put your second timer code here!
        }
    }
}
All of the other examples I've found online are the same story: a demo that performs a Console.Write every n ms, or something similar. I was hoping one of you might know how I could use the Multimedia Timer API to produce a more accurate one-shot delay in place of Thread.Sleep() in my code above.
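Something like this is roughly what I'm picturing: a blocking one-shot wrapper around the same winmm.dll calls. This is only a sketch I adapted from the class above, not something I've tested; the TIME_ONESHOT flag (0) and the AutoResetEvent handshake are my own guesses at the adaptation:

// AccurateDelay.cs -- hypothetical one-shot adaptation of the class above (untested)
using System;
using System.Runtime.InteropServices;
using System.Threading;

namespace YourProjectsNamespace
{
    static class AccurateDelay
    {
        private delegate void TimerEventDel(int id, int msg, IntPtr user, int dw1, int dw2);
        private const int TIME_ONESHOT = 0;  // Fire the callback once, then stop

        [DllImport("winmm.dll")]
        private static extern int timeBeginPeriod(int msec);
        [DllImport("winmm.dll")]
        private static extern int timeEndPeriod(int msec);
        [DllImport("winmm.dll")]
        private static extern int timeSetEvent(int delay, int resolution, TimerEventDel handler, IntPtr user, int eventType);

        // Block the calling thread for approximately 'ms' milliseconds
        public static void Wait(int ms)
        {
            using (var done = new AutoResetEvent(false))
            {
                TimerEventDel handler = (id, msg, user, dw1, dw2) => done.Set();
                timeBeginPeriod(1);  // Request 1 ms timer resolution
                try
                {
                    if (timeSetEvent(ms, 0, handler, IntPtr.Zero, TIME_ONESHOT) == 0)
                        throw new InvalidOperationException("timeSetEvent failed");
                    done.WaitOne();  // Wake when the multimedia timer fires
                }
                finally
                {
                    timeEndPeriod(1);
                    GC.KeepAlive(handler);  // Keep the delegate alive until the callback has run
                }
            }
        }
    }
}

The Thread.Sleep(55) line in my code above would then just become AccurateDelay.Wait(55).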
Edit / Update:
I was successful in implementing the write, sleep, read approach as described above, without any special classes or libraries. I had a fundamental misunderstanding about the way serial ports function. Allow me to elaborate:
When you look at the physical data lines of the serial signal, there is a delta t between sending and receiving a packet. I had wrongly assumed that once the incoming packet had begun to transmit, if my software had not already started reading it, I had missed my window of opportunity.
Thanks to a coworker, I learned that every serial device actually has a FIFO (first in, first out) buffer that stores the incoming data. It turns out I was attempting to read from the buffer too soon. All I did was increase the Thread.Sleep() delay from 55 ms (the aforementioned delta t) to ~200 ms, and everything works perfectly.
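If you'd rather not tune a magic sleep value, an alternative is to poll SerialPort.BytesToRead until the driver's buffer holds the whole packet. A sketch, using the same serialPort2 as above (the packetLength/timeoutMs values are examples):

// Sketch: wait for the full response instead of sleeping a fixed time
private byte[] ReadResponse(int packetLength, int timeoutMs)
{
    int deadline = Environment.TickCount + timeoutMs;
    // Wait until the serial driver's FIFO holds the entire packet (or time out)
    while (serialPort2.BytesToRead < packetLength)
    {
        if (Environment.TickCount - deadline > 0)
            throw new TimeoutException("Response packet never completed");
        Thread.Sleep(1);
    }
    byte[] buffer = new byte[packetLength];
    serialPort2.Read(buffer, 0, packetLength);  // Complete read: the buffer already holds packetLength bytes
    return buffer;
}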
This does not answer my original question, but if you, like me, are a hardware engineer attempting to write a program that communicates with a hardware device over serial, know that you can still read the incoming data from the hardware buffer well after the data has been transmitted.
I'm not quite sure why everyone is so emphatic about using the DataReceived event handler, but if you're talking to one serial device at a time, once every few seconds (as I am), I see no reason why this approach won't work.