Use the WWW::Mechanize module in Perl. It provides numerous methods that perform web-browser-like functions (fetching pages, filling forms, clicking buttons). Below is sample code:
use strict;
use warnings;
use WWW::Mechanize;

my $username = "admin";
my $password = "welcome1";
my $outpath  = "/home/data/output";
my $fromday  = 7;
my $url      = "https://www.myreports.com/tax_report.php";
my $name     = "tax_report";
my $outfile  = "$outpath/$name.html";

# Create the browser object and fetch the login page
my $mech = WWW::Mechanize->new(noproxy => '0');
$mech->get($url);

# Fill in the login form fields
$mech->field(login  => $username);
$mech->field(passwd => $password);

# Dump every outgoing request and incoming response for debugging
$mech->add_handler("request_send",  sub { shift->dump; return });
$mech->add_handler("response_done", sub { shift->dump; return });

# Submit the form by clicking the button labelled "Login now"
$mech->click_button(value => "Login now");

# Grab the resulting page and write it to the report file
my $response = $mech->content();
print "Generating report: $name...\n";
open(my $out, '>>', $outfile) or die "Cannot create report file $outfile: $!";
print {$out} $response;
close $out;
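If the login form is the first (or only) form on the page, the same flow can be written more compactly with submit_form() and save_content(), which writes the response straight to disk. This is a sketch, not a drop-in replacement: the form number, field names and button label are assumptions you need to check against the actual page.

use strict;
use warnings;
use WWW::Mechanize;

# Hypothetical values -- reuse the real URL, credentials and output path
my $url     = "https://www.myreports.com/tax_report.php";
my $outfile = "/home/data/output/tax_report.html";

# autocheck => 1 makes get()/submit_form() die on any HTTP error
my $mech = WWW::Mechanize->new(autocheck => 1);
$mech->get($url);

# Fill in and submit the first form on the page in one call;
# the field names (login, passwd) are taken from the example above
$mech->submit_form(
    form_number => 1,
    fields      => {
        login  => "admin",
        passwd => "welcome1",
    },
);

# Write the page returned after login directly to the report file
$mech->save_content($outfile);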
In case the page you want to scrape relies on JavaScript, have a look at WWW::Mechanize::Firefox, but note that it requires installing the MozRepl add-on in Firefox.
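As a minimal sketch (assuming a running Firefox instance with MozRepl enabled, and reusing the URL from the example above), the API mirrors WWW::Mechanize, but content() returns the DOM after JavaScript has executed:

use strict;
use warnings;
use WWW::Mechanize::Firefox;

# Connects to the running Firefox via the MozRepl add-on
my $mech = WWW::Mechanize::Firefox->new();
$mech->get('https://www.myreports.com/tax_report.php');

# The page source as rendered by the browser, JavaScript included
my $html = $mech->content();
print $html;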