My problem is the following: I scripted a workaround to automate the download of program files whose file names contain version numbers, like programv1.55.50.zip (HTTP or HTTPS sites). The workaround relies on linearly rising version numbers (the predecessor is version 1.55.49, the current one is 1.55.50 and the successor is "maybe" 1.55.51). I used Excel to build the rows 1.55.50 to 1.57.99 and joined them with the download link. As a result I get, for example:

https://linktofile/programv1.55.49.zip, https://linktofile/programv1.55.50.zip .. https://linktofile/programv1.57.99.zip.

All links I pasted into a linkstoprogram.txt file, one row per link, for use with the -i switch of wget.
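For illustration, roughly how such a list could also be built in batch instead of Excel (a minimal sketch only; the base URL is a placeholder and it generates the superset 1.55.00 to 1.57.99):

@echo off
rem Sketch only: write candidate download links into linkstoprogram.txt
rem BaseUrl is a placeholder for the real download location
setlocal EnableDelayedExpansion
set "BaseUrl=https://linktofile/programv"
> "linkstoprogram.txt" (
    for /L %%j in (55,1,57) do (
        for /L %%k in (0,1,99) do (
            rem pad the last number to two digits, e.g. 5 -> 05
            set "Patch=0%%k"
            echo !BaseUrl!1.%%j.!Patch:~-2!.zip
        )
    )
)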

Using

wget --timestamping --referer foo --recursive --no-parent --input-file=C:\Path\linkstoprogram.txt --show-progress --append-output=C:\Path\%Timestamp%_program.log

on all those links, only the two files programv1.55.49.zip and the current programv1.55.50.zip actually get downloaded, plus massive error logs for all the other non-available "dummy" files. That is why I want to separate the log of these queries. The next step is to sort the files by date, delete all files except the newest one, and copy it:

rem keep only the newest *.zip: /o-d sorts newest first, skip=1 spares it, the rest is securely deleted
FOR /F "skip=1 eol=: delims=" %%G IN ('dir /b /o-d *.zip') DO SDelete64 -p 3 -r -nobanner "%%G"

copy programv*.zip program.zip

My question is about use in a Windows batch file; I also use the newest CoreUtils for Windows:

How can I examine the returned HTTP status codes, checking whether the values 200, 302, 403 and 404 pass or fail?

If all status codes pass, append the complete download link to a text file created within this procedure; if one or more status codes fail, do nothing.

wget -q --spider address 

seems to return an errorlevel only for status code 200, so all errorlevels appear to be 0.
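To illustrate, this is the kind of errorlevel check I tried in batch (a minimal sketch; the exit-code behaviour may depend on the wget build):

@echo off
rem Sketch only: probe each link with wget --spider and look at the exit code
for /f "usebackq delims=" %%U in ("C:\Path\linkstoprogram.txt") do (
    wget -q --spider %%U
    rem errorlevel 1 or higher should mean the request failed, but here it stays 0
    if errorlevel 1 (echo FAIL %%U) else (echo PASS %%U)
)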

I found some references about this context but did not get any further: this answer by @DrFloyd5, this answer by @Jim Davis, and this answer by @Stuart Siegler (for Windows).

These Bash scripts (especially the answer by @jm666) come close to what I want, but I cannot reproduce them in batch, and as a result I need a text file with the passing download links, not the error codes of the links!

#!/bin/bash
while read LINE; do
  curl -o /dev/null --silent --head --write-out '%{http_code}\n' "$LINE"
done < url-list.txt

and

dosomething() {
        code="$1"; url="$2"
        case "$code" in
                200) echo "OK for $url";;
                302) echo "redir for $url";;
                404) echo "notfound for $url";;
                *) echo "other $code for $url";;
        esac
}

#MAIN program
while read url
do
        uri=($(echo "$url" | sed 's~http://\([^/][^/]*\)\(.*\)~\1 \2~'))
        HOST=${uri[0]:=localhost}
        FILE=${uri[1]:=/}
        exec {SOCKET}<>/dev/tcp/$HOST/80
        echo -ne "GET $FILE HTTP/1.1\nHost: $HOST\n\n" >&${SOCKET}
        res=($(<&${SOCKET} sed '/^.$/,$d' | grep '^HTTP'))
        dosomething ${res[1]} "$url"
done << EOF
http://stackoverflow.com
http://stackoverflow.com/some/bad/url
EOF

OR

Or does a solution exist using the wget -A switch, so that I do not need the workaround at all?
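What I have in mind is roughly the following invocation (untested; it assumes the server exposes a browsable directory listing that recursive wget can walk, and the accept pattern is only a guess):

wget --recursive --no-parent --level=1 --accept "programv*.zip" https://linktofile/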

I am interested in both solutions!

dimarco76

1 Answer

Suppose that you have this text file linkstoprogram.txt with a batch file in the same folder:


linkstoprogram.txt

https://www.google.tn/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png
https://linktofile/programv1.55.49.zip
https://linktofile/programv1.55.50.zip 
https://linktofile/programv1.57.99.zip
https://www.google.com
https://www.stackoverflow.com
https://downloads.malwarebytes.com/file/mb3/
https://download.toolslib.net/download/file/1/1388

Batch code: Get_Status_Codes_Curl.bat

@echo off
Title Example Batch Script to Get Status Code from cURL
Set "InputFile=%~dp0linkstoprogram.txt"
Set "LogFile=%~dpn0.txt"
If Exist "%LogFile%" Del "%LogFile%"
@for /f "delims=" %%a in ('Type "%InputFile%"') do (
    rem HEAD request: discard the body (-o NUL), print only the HTTP status code (-w)
    echo %%a & curl -s -o NUL -I -w "%%{http_code}\n" %%a
    rem repeat the same request and append the URL plus status code to the log file
    >>"%LogFile%" (echo %%a & curl -s -o NUL -I -w "%%{http_code}\n" %%a)
)
If Exist "%LogFile%" Start /MAX "Log" "%LogFile%"
Hackoo
  • Thank you so much for your answer. I checked it out and got 302 for the two existing downloads. But maybe the server has problems right now. Will try again tomorrow and give feedback. – dimarco76 Sep 16 '21 at 20:39