That is quite a few threads to launch. It is probably below your system's thread limit, but whether it runs well depends on the resources you have available for the job.
If you'd rather use a worker pool, Parallel::ForkManager is a popular module for that.
The module's documentation offers this example for a mass-downloader:
use LWP::Simple;
use Parallel::ForkManager;
...
@links = (
    ["http://www.foo.bar/rulez.data", "rulez_data.txt"],
    ["http://new.host/more_data.doc", "more_data.doc"],
    ...
);
...
# Max 30 processes for parallel download
my $pm = Parallel::ForkManager->new(30);

foreach my $linkarray (@links) {
    $pm->start and next;    # do the fork

    my ($link, $fn) = @$linkarray;
    warn "Cannot get $fn from $link"
        if getstore($link, $fn) != RC_OK;

    $pm->finish;            # do the exit in the child process
}
$pm->wait_all_children;
LWP::UserAgent doesn't export the getstore sub that LWP::Simple provides, but it does have a mirror method that behaves similarly: it saves the response body to a file, and skips the download entirely if the local copy is already up to date.
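As a rough sketch of how the same worker-pool pattern looks with LWP::UserAgent's mirror method (the URLs and filenames here are the placeholder values from the docs example above, not real endpoints):

```perl
use strict;
use warnings;
use LWP::UserAgent;
use Parallel::ForkManager;

# Placeholder link list, same shape as in the docs example
my @links = (
    ["http://www.foo.bar/rulez.data", "rulez_data.txt"],
    ["http://new.host/more_data.doc", "more_data.doc"],
);

my $pm = Parallel::ForkManager->new(30);
my $ua = LWP::UserAgent->new;

foreach my $linkarray (@links) {
    $pm->start and next;    # fork a child for this download

    my ($link, $fn) = @$linkarray;

    # mirror() fetches $link into $fn; if $fn already exists it sends
    # If-Modified-Since, and a 304 response means the file is current.
    my $response = $ua->mirror($link, $fn);
    warn "Cannot get $fn from $link: " . $response->status_line
        unless $response->is_success or $response->code == 304;

    $pm->finish;            # exit the child process
}
$pm->wait_all_children;
```

Note that is_success is false for a 304 Not Modified, so the check above treats "already up to date" as success rather than warning about it.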