I have a .txt file with more than 30,000 records. Each record is on its own line and is an IP address, like this:
192.168.0.1
192.168.0.2
192.168.0.3
192.168.0.4
192.168.0.5
192.168.0.6
192.168.0.7
192.168.0.8
192.168.0.9
192.168.0.10
I read each line in a bash script and need to run a curl request like this:
while IFS= read -r line || [[ -n "$line" ]]; do
    #check_site "$line"
    # Fetch headers with a 1-second timeout; capture both stdout and stderr.
    resp=$(curl -i -m1 "http://$line" 2>&1)
    # Treat the host as failed if the response does not match the $ok pattern.
    if ! echo "$resp" | grep -Eq "$ok"; then
        #echo -e "failed: $line" >> "${logfile}"
        echo -e "Command: curl -i -m1 http://$line 2>&1" >> "${outfile}"
        echo -e "failed: $line:\n\n \"$resp\"\n\n" >> "${outfile}"
        echo "$line" >> "${faillog}"
    fi
done < "${FILE}"
Is there a way to process multiple lines of the file in parallel, to reduce the total execution time?
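For reference, one common pattern (a minimal sketch, not from the script above) is to wrap the per-IP check in a function, export it, and let GNU xargs run a fixed number of copies concurrently. This assumes GNU xargs (for -a and -P) and that $ok, ${outfile}, ${faillog}, and ${FILE} are defined as in the script; the function name check_ip and the job count of 20 are placeholders:

#!/usr/bin/env bash

# Hypothetical helper: check one IP and log failures, mirroring the loop body above.
check_ip() {
    local line="$1"
    local resp
    resp=$(curl -i -m1 "http://$line" 2>&1)
    if ! echo "$resp" | grep -Eq "$ok"; then
        echo -e "Command: curl -i -m1 http://$line 2>&1" >> "${outfile}"
        echo -e "failed: $line:\n\n \"$resp\"\n\n" >> "${outfile}"
        echo "$line" >> "${faillog}"
    fi
}
export -f check_ip
export ok outfile faillog

# Run up to 20 checks at a time; each line of the file becomes one job.
xargs -a "${FILE}" -P 20 -I{} bash -c 'check_ip "$@"' _ {}

One caveat: with parallel jobs, the multi-line appends to ${outfile} can interleave; having each job write to its own temporary file and concatenating them afterwards avoids that.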