I want to write a bash script that calculates the average rate of packets captured (packets per second) on a network interface. I found this answer, which gave me a hand, but I needed to improve on it, so I extended it this way:
#!/bin/bash

INTERVAL="1"   # sampling interval in seconds

if [ -z "$1" ]; then
    echo
    echo "usage: $0 [network-interface]"
    echo
    echo "e.g. $0 eth0"
    echo
    echo "shows packets-per-second"
    exit 1
fi

N_PPS=0        # total packets counted so far
SUM_TIME=0     # total sampling time in seconds

# Called on Ctrl+C: print the average packets per second and exit
calc_avg()
{
    AVG=$((N_PPS / SUM_TIME))
    echo
    echo "The average of packets captured is $AVG pkts/s"
    exit 0
}

trap calc_avg SIGINT

while true
do
    R1=$(cat "/sys/class/net/$1/statistics/rx_packets")
    sleep "$INTERVAL"
    R2=$(cat "/sys/class/net/$1/statistics/rx_packets")
    RXPPS=$((R2 - R1))
    SUM_TIME=$((SUM_TIME + 1))
    if [ "$RXPPS" > 0 ]; then
        N_PPS=$((N_PPS + RXPPS))
    fi
    echo "RX $1: $RXPPS pkts/s"
done
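For context, this is how I run it (avg_pps.sh and eth0 are just placeholder names for my script and interface, and the numbers below are only an example); pressing Ctrl+C triggers calc_avg through the SIGINT trap and prints the average:
./avg_pps.sh eth0
RX eth0: 12 pkts/s
RX eth0: 7 pkts/s
^C
The average of packets captured is 9 pkts/s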
As I explained, I need the average of the packets captured per second, so I accumulate the packet count in N_PPS. I would like to understand whether this calculation is correct, both for SUM_TIME and for the resulting AVG, and also how to measure in microseconds instead of seconds in bash. Any advice or criticism is welcome.
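Regarding microseconds, I was thinking of something along these lines inside the loop, using date +%s%N (GNU date, so %N may not be available everywhere) to take nanosecond timestamps and bc to compute the rate over the measured interval; I am not sure this is the right approach, so please correct me:
# measure the real elapsed time of each sample instead of assuming
# it is exactly $INTERVAL seconds
T1=$(date +%s%N)                                     # nanoseconds since the epoch
R1=$(cat "/sys/class/net/$1/statistics/rx_packets")
sleep "$INTERVAL"
R2=$(cat "/sys/class/net/$1/statistics/rx_packets")
T2=$(date +%s%N)
ELAPSED_US=$(( (T2 - T1) / 1000 ))                   # elapsed time in microseconds
RATE=$(echo "scale=3; ($R2 - $R1) * 1000000 / $ELAPSED_US" | bc)
echo "RX $1: $RATE pkts/s (sampled over ${ELAPSED_US} us)"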
EDIT AFTER CHANGES
I made some changes thanks to @jarr's answer, but I still have one problem: if the result is less than 1, the leading 0 is not shown, i.e. AVG = .333:
AVG=$(echo "scale=3; $N_PPS/$SUM_TIME" | bc)
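Would re-formatting the value with printf be a reasonable way to get the leading zero back? Something like this (just an idea, not fully verified):
AVG=$(echo "scale=3; $N_PPS/$SUM_TIME" | bc)
# printf re-parses the number and prints it with a leading zero, e.g. 0.333
AVG=$(printf '%.3f' "$AVG")
echo "The average of packets captured is $AVG pkts/s"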
I also noticed that when I launch this script, it creates a file called 0 of 0 bytes. Why does this happen, and how can I avoid it?