
I want to allow my users to upload a file by providing a URL to the image.

Pretty much like imgur, you enter http://something.com/image.png and the script downloads the file, then keeps it on the server and publishes it.

I tried using file_get_contents() and getimagesize(), but I'm thinking there would be problems:

  1. How can I protect the script from 100 users supplying 100 URLs to large images?
  2. How can I tell if the download process will take, or is already taking, too long?
Silviu-Marian

2 Answers


This is actually interesting.

It appears that you can actually track and control the progress of a cURL transfer. See the documentation for CURLOPT_NOPROGRESS, CURLOPT_PROGRESSFUNCTION and CURLOPT_WRITEFUNCTION.

I found this example and changed it to:

<?php

// Progress log and download target.
file_put_contents('progress.txt', '');

$target_file_name = 'targetfile.zip';
$target_file = fopen($target_file_name, 'w');

// Set by progress_callback and read by write_callback, so the size limit
// can be enforced while the transfer is running. (The original example
// initialized this after curl_exec(), which was too late.)
$_download_size = 0;

// Note: in PHP 5.5+ this callback receives the cURL resource as an
// additional first argument.
function progress_callback($download_size, $downloaded_size, $upload_size, $uploaded_size) {
    global $_download_size;
    $_download_size = $download_size;
    static $previous_progress = 0;

    if ($download_size == 0) {
        $progress = 0;
    }
    else {
        $progress = round($downloaded_size * 100 / $download_size);
    }

    if ($progress > $previous_progress) {
        $previous_progress = $progress;
        $fp = fopen('progress.txt', 'a');
        fputs($fp, $progress .'% ('. $downloaded_size .'/'. $download_size .")\n");
        fclose($fp);
    }
}

function write_callback($ch, $data) {
    global $target_file;
    global $_download_size;

    // Returning anything other than the number of bytes received
    // aborts the transfer, so an empty string cancels the download.
    if ($_download_size > 1000000) {
        return '';
    }
    return fwrite($target_file, $data);
}

$ch = curl_init('http://localhost/so/testfile2.zip');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_NOPROGRESS, FALSE);  // required, or the progress callback never fires
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, 'progress_callback');
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'write_callback');
curl_exec($ch);
curl_close($ch);

if ($target_file) {
    fclose($target_file);
}

write_callback checks whether the total download size reported by cURL is greater than a specified limit. If it is, it returns an empty string, which aborts the transfer. I tested this on two files of 80 KB and 33 MB, respectively, with a 1 MB limit. In your case, progress_callback is pointless beyond the second line, but I kept everything in there for debugging purposes.

One other way to get the size of the data in advance is to make a HEAD request, but I don't think servers are required to send a Content-Length header.
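A minimal sketch of such a HEAD request with cURL, assuming the URL is a placeholder; CURLINFO_CONTENT_LENGTH_DOWNLOAD returns -1 when the server doesn't send the header:

```php
<?php

// Ask for headers only and read Content-Length, if the server sends one.
function remote_size($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_exec($ch);
    $size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);
    return ($size >= 0) ? (int) $size : false;       // false when unknown
}

$size = remote_size('http://example.com/image.png');
if ($size === false || $size > 1000000) {
    // Size unknown or too large: refuse before downloading anything.
    // (A write-callback limit is still needed, since the header can lie.)
}
```

Since the header can be missing or wrong, this only works as a first check; the write-callback limit above remains the actual safeguard.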

Alexei
  • No they're not. But your example looks flawless. Thanks. – Silviu-Marian Sep 04 '12 at 20:03
  • Glad I could help. You should change the title of your question to something more coherent such as "control curl file transfers in php" and maybe add a tag for "curl". – Alexei Sep 04 '12 at 20:51

To answer question one, you simply need to add the appropriate limits in your code: define how many requests you want to accept in a given amount of time, track requests per user in a database, and reject anything over that limit. Also put a cap on file size.
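A minimal sketch of that idea as a fixed-window limiter. Here `$log` is an in-memory array keyed by user id; in production it would live in the database, and the limit and window values are illustrative:

```php
<?php

// Allow at most $max_requests per $window seconds per user.
// $log maps user id => array of recent request timestamps.
function allow_request(array &$log, $user, $now, $max_requests = 5, $window = 3600) {
    $timestamps = isset($log[$user]) ? $log[$user] : array();

    // Keep only the timestamps that still fall inside the window.
    $recent = array_filter($timestamps, function ($t) use ($now, $window) {
        return $t > $now - $window;
    });

    if (count($recent) >= $max_requests) {
        return false;                 // over the limit for this window
    }

    $recent[] = $now;                 // record this request
    $log[$user] = array_values($recent);
    return true;
}
```

The same check would be applied before starting any download, alongside the file-size cap.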

For question two, you can set appropriate timeouts if you use cURL.

Brad