I'm using perl-HTTP-Tiny-0.080 on Fedora 35 to check the status of a URL and determine the return code. My script runs fine until it hits one particular URL, a PDF at sophos.com. The script just stalls: the get() (or head()) call on the object returned by new() never returns. I've also tried setting a timeout, but it appears to be ignored.
use strict;
use warnings;
use HTTP::Tiny;
my $url = "https://news.sophos.com/wp-content/uploads/2020/02/CloudSnooper_report.pdf";
my $response = HTTP::Tiny->new(timeout => 2)->get($url);
print "status: $response->{status} $url\n";
The print statement is never reached. Fetching the same URL manually with wget succeeds, and setting the agent to something other than the default "HTTP-Tiny" string makes no difference; the call still hangs:
my $response = HTTP::Tiny->new(agent => "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.69 Safari/537.36")->get($url);
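As a workaround I've considered forcing a hard timeout with SIGALRM around the call, roughly like the sketch below (untested; the 5-second limit is arbitrary, and I don't know whether the signal will reliably interrupt a blocked SSL read):

use HTTP::Tiny;

my $url = "https://news.sophos.com/wp-content/uploads/2020/02/CloudSnooper_report.pdf";
my $response = eval {
    local $SIG{ALRM} = sub { die "timeout\n" };  # die() unwinds out of get()
    alarm 5;                                     # hard 5-second limit
    my $r = HTTP::Tiny->new(timeout => 2)->get($url);
    alarm 0;                                     # cancel the pending alarm
    $r;
};
if ($@) {
    print "request timed out: $url\n";
} else {
    print "status: $response->{status} $url\n";
}

But I'd rather understand why HTTP::Tiny's own timeout isn't kicking in than paper over it with a signal.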
This code is part of a larger script that I'm using to check a series of URLs from a buffer to determine whether they're 404s and should be removed, or are still working links.
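For context, here's a stripped-down version of that loop (URLs and variable names simplified; the real script reads the URLs from a buffer rather than a hard-coded list):

use strict;
use warnings;
use HTTP::Tiny;

my @urls = (
    "https://example.com/some-link",
    "https://news.sophos.com/wp-content/uploads/2020/02/CloudSnooper_report.pdf",
);

my $http = HTTP::Tiny->new(timeout => 2);
for my $url (@urls) {
    # head() is enough to get the status code without downloading the body
    my $response = $http->head($url);
    if ($response->{status} == 404) {
        print "dead link, remove: $url\n";
    } else {
        print "ok ($response->{status}): $url\n";
    }
}

(I'm aware HTTP::Tiny reports its own internal failures as status 599, so non-404 doesn't strictly mean the link is good, but that's a separate concern from the hang.)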
I'm not sure what further information I can provide.