We have a file sent from an HTML form to a CGI (Perl) script. We know that files uploaded via CGI.pm are spooled to a temporary file on disk (rather than being held in RAM). Given that, my question is: is it possible to read that file in chunks from within the CGI script (say, into an array)? The code is as simple as possible:
#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use CGI::Carp qw(fatalsToBrowser);
my $q = CGI->new;
my $fh = $q->upload("file"); # here, I think, there must be some read loop for reading from byte N to byte X
And if that is possible, then I have a second sub-question: is it possible to read different chunks out of order (say: from the beginning, bytes 0 to 500; then bytes 1000 to 1500; then bytes 501 to 999)? Thanks!
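As I understand it, the answer to both questions should be yes: `upload()` returns a filehandle onto the spooled temp file, so ordinary `read()` and `seek()` ought to work on it, including jumping to arbitrary byte offsets. Below is a minimal, self-contained sketch of the mechanics; it uses a plain `File::Temp` file in place of a real upload handle (which I can't reproduce outside a CGI request), but the `seek`/`read` calls would be the same:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Scratch file standing in for the temp file that CGI.pm spools an
# upload to. In the real script you would instead do:
#   my $fh = $q->upload("file");
my ($fh, $path) = tempfile(UNLINK => 1);
binmode $fh;
print $fh 'x' x 1500;          # 1500 bytes of dummy data

my $buf;

# bytes 0..499
seek($fh, 0, 0);
my $n1 = read($fh, $buf, 500);

# bytes 1000..1499
seek($fh, 1000, 0);
my $n2 = read($fh, $buf, 500);

# bytes 501..999 -- out of order relative to the reads above
seek($fh, 501, 0);
my $n3 = read($fh, $buf, 499);

print "$n1 $n2 $n3\n";         # 500 500 499
```

Note that the ranges in the question (0..500, 1000..1500, 501..999) overlap at the boundaries, so the sketch uses 0..499, 1000..1499, 501..999 instead. A `while (read($fh, my $chunk, 500)) { push @chunks, $chunk; }` loop would cover the sequential-chunks case.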