
I am reading data frames from a device using a software UART driver. It's written as a tty module, so reading from it is pretty standard:

struct termios options;
int tty = open(path, O_RDWR | O_NOCTTY  | O_SYNC);
tcgetattr(tty, &options);
options.c_cflag = B4800 | CS8 | CLOCAL | CREAD | PARENB;
options.c_iflag = IGNPAR;
options.c_oflag = 0;
options.c_lflag = 0;
tcsetattr(tty, TCSANOW, &options);
while (running)
{
    int rx_count = read(tty, (void*)header, header_size);
    // parse the header to figure out the total frame size
    rx_count = read(tty, (void*)frame_data, frame_size);
}

It works most of the time, but sometimes I miss bytes due to the limited reliability of the software UART.

My problem is that when bytes are missed, the code reads the initial bytes of the next frame as the final bytes of the previous frame. This causes random, unpredictable behavior.

I am wondering if there is a good way to read based on time, instead of data size. Something like this:

while (running)
{
    // read bytes continuously from the tty device
    // if 1 second passes with no further bytes coming, stop reading
    // process whatever data was read
}

I could probably hack together something that does this eventually, but I imagine there's a good chance this has already been figured out and I'm just failing to find the info on the internet.

ukyo_rulz
  • Possible duplicate of [How to implement a timeout in read function call?](https://stackoverflow.com/questions/2917881/how-to-implement-a-timeout-in-read-function-call) – hessam hedieh Jan 21 '19 at 09:56
  • Do the frames you read have a specific format? Perhaps with a start-of-frame and end-of-frame markers that are unique and easily recognizable? Then you could read byte by byte from the start-of-frame until the end-of-frame, and if you find a start-of-frame marker before the end-of-frame, then you know the packet was corrupted. – Some programmer dude Jan 21 '19 at 09:56
  • Considering that you cannot modify the structure of the frame, a timeout could be useful (http://man7.org/linux/man-pages/man2/select.2.html) – Jose Jan 21 '19 at 09:56
  • You can also make the descriptor non-blocking and read until there's an error with `errno == EWOULDBLOCK`. Possibly combine with my previous suggestion. – Some programmer dude Jan 21 '19 at 09:57
  • @Someprogrammerdude unfortunately, though there is a frame start marker, it's just one byte and this byte may also appear in the frame data. – ukyo_rulz Jan 21 '19 at 10:00
  • @Jose I was thinking something similar as well, but was wondering if there was a better way. – ukyo_rulz Jan 21 '19 at 10:02
  • *"sometimes I miss some bytes due to limitations in the reliability of the UART"* -- No, do not blame the UART unless you can prove that UART overruns are actually occurring. FYI your code "reads" from a buffer, not the UART. Your termios configuration is suboptimal and incomplete. You apparently have no idea if it's canonical mode or raw mode. See https://stackoverflow.com/questions/25996171/linux-blocking-vs-non-blocking-serial-read/26006680#26006680 – sawdust Jan 22 '19 at 00:03
  • @sawdust It's not a UART but a software UART, and I know it's missing bytes because I see it mistiming reads on my protocol analyzer. – ukyo_rulz Jan 22 '19 at 05:04

1 Answer


I was able to accomplish what I need using VMIN and VTIME, as described in the link from sawdust's comment: Linux Blocking vs. non Blocking Serial Read

This was also relevant to my problem. Somehow I couldn't figure out the right search terms to find it before: VMIN and VTIME Terminal Settings for variable sized messages

ukyo_rulz