Here is a shell script that takes a list of domains and a list of URI paths and records the HTTP status code for each combination. It runs much faster because the requests are backgrounded, but it misses a lot of requests:
while IFS= read -r url <&3; do
    while IFS= read -r uri <&4; do
        urlstatus=$(curl -o /dev/null --insecure --silent --head --write-out '%{http_code}' "${url}${uri}" --max-time 5) &&
        echo "$url $urlstatus $uri" >> urlstatus.txt &
    done 4<uri.txt
done 3<url.txt
If I run it normally (without the trailing `&`), it processes every request, but it is very slow. Is there a way to keep the speed while not missing any requests?
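One common approach is to replace the unbounded `&` backgrounding with a fixed-size worker pool via `xargs -P`, so only a bounded number of curl processes run at once. Below is a minimal sketch of that idea: the nested loops only emit url/uri pairs, and `xargs` fans them out to at most 20 concurrent curls. The file names (`url.txt`, `uri.txt`, `urlstatus.txt`) match the original script; the sample-input lines are only there so the sketch runs standalone (127.0.0.1:1 refuses connections instantly, so curl reports status `000`), and the pool size of 20 is an arbitrary choice to tune.

```shell
#!/bin/sh
# Sketch: bound concurrency with xargs -P instead of unbounded "&".
# Assumes url.txt and uri.txt exist; sample inputs are created only
# when they are missing, so the script runs standalone.
[ -f url.txt ] || printf 'http://127.0.0.1:1\n' > url.txt
[ -f uri.txt ] || printf '/a\n/b\n' > uri.txt

# Emit every url/uri pair, one pair per line...
while IFS= read -r url; do
    while IFS= read -r uri; do
        printf '%s %s\n' "$url" "$uri"
    done < uri.txt
done < url.txt |
# ...then run at most 20 curl processes at a time over the pairs.
xargs -n 2 -P 20 sh -c '
    status=$(curl -o /dev/null --insecure --silent --head \
                  --write-out "%{http_code}" --max-time 5 "$1$2")
    printf "%s %s %s\n" "$1" "$status" "$2"
' _ > urlstatus.txt
```

Unlike the original, this logs a line even when curl fails (you get `000` instead of a silently dropped request), which makes missed requests visible. Each worker writes one short line per request, so the lines from concurrent workers do not interleave in practice.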