I am writing a program that sends a 10-byte packet to an Arduino Uno using PySerial (the code for both the Arduino and my computer is below). The way the code is supposed to work is that whenever a packet of bytes is sent to the Arduino, it echoes the bytes back to the computer. The host (my laptop) then reads that data and prints it to the console, with a 100 millisecond delay between reads. However, when I run the code, it first slowly prints out ten empty bytes objects, b'', with a delay of well over 100 milliseconds between them (more like 750 milliseconds). After that, it behaves as expected, printing each element of the packet in sequence with 100 millisecond delays in between. Does anyone have an explanation of why this happens?
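If I understand PySerial's timeout semantics correctly, ser.read() returns an empty bytes object b'' when the timeout expires before a byte arrives, so the ten b'' lines look like timed-out reads rather than actual null bytes (b'\x00'). A minimal snippet showing what I mean, using the same port settings as my code below:

import serial

# With timeout=1.0, read() blocks for at most 1 second; if no byte
# arrives in that window it returns an empty bytes object, b''.
ser = serial.Serial(port="COM3", baudrate=9600, timeout=1.0)
r = ser.read()   # b'' here means the read timed out
print(repr(r))   # prints b'' on timeout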
Python code:
import serial
import time

ser = serial.Serial(port="COM3", baudrate=9600, timeout=1.0)
packet = [85, 85, 1, 10, 1, 32, 3, 184, 11, 13]

while True:
    p = bytes(packet)
    ser.write(p)                      # send the 10-byte packet
    for i in range(len(packet)):
        time.sleep(0.1)
        r = ser.read()                # read one echoed byte (1 s timeout)
        print(r)
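For comparison, here is a variant of my read loop that pulls the whole packet in one call instead of byte-by-byte (a sketch, assuming the same port and baud rate); with a timeout set, ser.read(10) blocks until 10 bytes arrive or the timeout expires, and returns whatever it received:

import serial
import time

ser = serial.Serial(port="COM3", baudrate=9600, timeout=1.0)
packet = bytes([85, 85, 1, 10, 1, 32, 3, 184, 11, 13])

while True:
    ser.write(packet)
    # read(10) returns up to 10 bytes; on timeout it returns
    # however many bytes actually arrived, possibly b''.
    r = ser.read(10)
    print(r)
    time.sleep(0.1)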
Arduino code:
void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() > 0) {
    int r = Serial.read();  // read one byte from the host
    Serial.write(r);        // echo it straight back
  }
}
EDIT: Here is the console output when this program is run:
b''
b''
b''
b''
b''
b''
b''
b''
b''
b''
b'U'
b'U'
b'\x01'
b'\n'
b'\x01'
b' '
b'\x03'
b'\xb8'
b'\x0b'
b'\r'
[previous ten lines repeated indefinitely]
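To quantify the per-read delays I describe above, each read could be wrapped in timestamps, as in this sketch (same port settings assumed; time.monotonic() is the only addition to my original loop):

import serial
import time

ser = serial.Serial(port="COM3", baudrate=9600, timeout=1.0)
packet = bytes([85, 85, 1, 10, 1, 32, 3, 184, 11, 13])

ser.write(packet)
for _ in range(len(packet)):
    t0 = time.monotonic()
    r = ser.read()                        # blocks for up to 1 second
    ms = (time.monotonic() - t0) * 1000
    print(r, f"{ms:.0f} ms")              # echoed byte plus per-read latency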