
I could use getimagesize() to validate an image, but the problem is that a mischievous user could submit a link to a 10 GB random file, which would whack my production server's bandwidth. How do I limit the file size that getimagesize() downloads (e.g. a 5 MB maximum image size)?

PS: I did research before asking.

Jürgen Paul
    Maybe this suits your use case http://stackoverflow.com/questions/4635936/super-fast-getimagesize-in-php – Mike Sep 03 '12 at 14:51

3 Answers


You don't want to do something like getimagesize('http://example.com') to begin with, since this will download the image once, check the size, then discard the downloaded image data. That's a real waste of bandwidth.

So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read little by little and write it to a temporary file, keeping count of how much you have read. Once you cross 5MB and are still not finished reading, you stop and reject the image.

You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it, since it can be spoofed or omitted.
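For illustration, here is one way such a pre-check could look. This is only a sketch (probably_too_large is a made-up helper name), and because the header can be missing or forged, the byte-counted download from the other answers is still needed afterwards:

function probably_too_large($url, $limit)
{
    // Send a HEAD request so no body is transferred (get_headers() defaults to GET).
    stream_context_set_default(array('http' => array('method' => 'HEAD')));
    $headers = get_headers($url, 1);
    if ($headers === false || !isset($headers['Content-Length'])) {
        return false; // size unknown: fall back to the counted download
    }
    // After redirects Content-Length may be an array of values; use the last one.
    $length = is_array($headers['Content-Length'])
        ? end($headers['Content-Length'])
        : $headers['Content-Length'];
    return (int) $length > $limit;
}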

deceze

Here is an example; you will need to adjust it to fit your requirements.

function getimagesize_limit($url, $limit)
{
    // Create a temporary file to hold the downloaded data.
    $tmpfilename = tempnam(sys_get_temp_dir(), uniqid() . '-');
    $fp = fopen($url, 'rb');
    if (!$fp) return false;
    $tmpfile = fopen($tmpfilename, 'wb');
    $size = 0;
    // Read in 8 KB chunks and stop as soon as the limit is exceeded.
    while (!feof($fp) && $size < $limit)
    {
        $content = fread($fp, 8192);
        $size += strlen($content); // count what was actually read, not the chunk size
        fwrite($tmpfile, $content);
    }
    fclose($fp);
    fclose($tmpfile);
    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}
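As a usage illustration (the URL is just a placeholder, and 5 MB is the limit from the question):

$info = getimagesize_limit('http://example.com/photo.jpg', 5 * 1024 * 1024);
if ($info !== false)
{
    list($width, $height) = $info;
    // the downloaded data parsed as an image
}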
xdazz

You can download the file separately, imposing a maximum size you wish to download:

function mygetimagesize($url, $max_size = -1)
{
        // create temporary file to store data from $url
        if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
                return false;
        }
        // open input and output
        if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
                unlink($tmpfname);
                return false;
        }
        // copy at most $max_size bytes
        stream_copy_to_stream($in, $out, $max_size);

        // close input and output file
        fclose($in); fclose($out);

        // retrieve image information
        $info = getimagesize($tmpfname);

        // get rid of temporary file
        unlink($tmpfname);

        return $info;
}
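As discussed in the comments below, stream_copy_to_stream() returns the number of bytes copied, so the caller can tell when the limit was hit. One possible refinement (a sketch only, not part of the original answer) is to copy one extra byte and reject the file if that byte arrives, which distinguishes a file that is exactly the limit from one that exceeds it:

        // copy at most $max_size + 1 bytes (or everything when no limit is set)
        $maxlen = ($max_size >= 0) ? $max_size + 1 : -1;
        $copied = stream_copy_to_stream($in, $out, $maxlen);
        fclose($in); fclose($out);
        if ($max_size >= 0 && $copied > $max_size) {
            unlink($tmpfname);
            return false; // larger than allowed: reject instead of analysing a truncated file
        }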
Ja͢ck
  • Hi, wouldn't I have memory issues with `stream_copy_to_stream` if I have like 100 users uploading simultaneously? – Jürgen Paul Sep 03 '12 at 15:41
  • @Severus how would you imagine that? Internally it uses 8K buffers to perform the copy I believe. – Ja͢ck Sep 03 '12 at 15:45
  • @Severus could you be more specific? :) – Ja͢ck Sep 04 '12 at 11:33
  • so how will I throw an exception when the limit is reached? – Jürgen Paul Sep 04 '12 at 11:38
  • @Severus stream_copy_to_stream() returns the bytes copied so you could check if the number equals $max_size – Ja͢ck Sep 04 '12 at 11:49
  • Correct me if I'm wrong, but this function *limits* to the maximum filesize? I tried using it to an image and it missed a few bytes, the quality of the image degraded. – Jürgen Paul Sep 04 '12 at 13:10
  • @Severus it copies at most $max_size bytes, so if the result of the copy operation equals that number you should not continue. – Ja͢ck Sep 04 '12 at 13:34
  • @Jack: Nice answer, check out also `file_put_contents`, might make it even a bit more compact. – hakre Jan 01 '13 at 20:39
  • @hakre no, that's not possible because file_put_contents() requires the whole data to write iirc. – Ja͢ck Jan 02 '13 at 13:04