I am looking for references and pointers on setting VMIN and VTIME for an application receiving streaming serial data. Messages of varying length are sent to the application with random phase but at a fixed frequency per message type (some messages arrive once per second, while others can come as fast as every 10 ms).
There is no framing protocol from the sender: no sync byte telling me that the byte I just received starts a message, and no field telling me the message length. However, each message type has a known header byte pattern, which in turn determines its length. So in my program I read a fixed number of bytes into a ring buffer and then try to locate and match a header by scanning backward from the rear of the buffer (say I read in n bytes; I start matching the byte pattern at position n-1, then n-2, and so on).
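To make that concrete, here is a stripped-down sketch of the backward search (the function name and the two-byte header values are placeholders for illustration; my real buffer is a ring buffer, linearized here for clarity):

```c
#include <stddef.h>   /* size_t, ptrdiff_t */
#include <string.h>   /* memcmp */

/* Scan backward from the newest data for a header byte pattern.
 * buf    : buffer contents (ring buffer linearized for clarity)
 * n      : number of valid bytes in buf
 * hdr    : header pattern identifying the message type
 * hdrlen : length of the header pattern
 * Returns the offset of the most recent full match, or -1 if none. */
static ptrdiff_t find_header_from_rear(const unsigned char *buf, size_t n,
                                       const unsigned char *hdr, size_t hdrlen)
{
    if (hdrlen == 0 || n < hdrlen)
        return -1;
    for (size_t i = n - hdrlen + 1; i-- > 0; ) {
        if (memcmp(buf + i, hdr, hdrlen) == 0)
            return (ptrdiff_t)i;
    }
    return -1;
}
```

Once a header is found, the message type tells me the expected length, and I can check whether the whole message is already in the buffer.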
Now I am trying to determine the VTIME and VMIN settings I should use. Currently I have VTIME = 1 and VMIN = 256, and I wonder whether this is correct (I have since realized that the c_cc[] entries are unsigned chars, so 255 is the maximum and 256 would wrap to 0). I am tempted to set both values to 0, but I fear that would turn read() into a non-blocking poll and increase my CPU usage.
Does the choice of values also interact with the cycle time, i.e. how often I call read()? If my read loop runs faster than VTIME, will that hurt performance? And how could I get a finer timeout than the current 0.1 s, given that VTIME counts in tenths of a second?
Any pointers or direction would be appreciated.