
I have a basic PHP script that accepts a file sent by the user. The code is just a simple API: the user sends us a file through a POST request.

I was wondering how I could make this handle over 3000 users sending files at the same time. Do I need to use threading? What's the best solution?

On the user's website (www.example.com):

    <form action="http://www.mywebsite.com/file.php" method="post" enctype="multipart/form-data">
        Your Photo: <input type="file" name="photo" size="25" />
        <input type="submit" name="submit" value="Submit" />
    </form>

Here is the code on my server (mywebsite.com), in file.php:

    //if they DID upload a file...
    if ($_FILES['photo']['name'])
    {
        //if no errors...
        if (!$_FILES['photo']['error'])
        {
            //now is the time to modify the future file name and validate the file
            $new_file_name = strtolower($_FILES['photo']['name']); //rename file
            $valid_file = true;
            if ($_FILES['photo']['size'] > 1024000) //can't be larger than 1 MB
            {
                $valid_file = false;
                $message = 'Oops!  Your file\'s size is too large.';
            }

            //if the file has passed the test
            if ($valid_file)
            {
                //move it to where we want it to be
                move_uploaded_file($_FILES['photo']['tmp_name'], 'uploads/'.$new_file_name);
                $message = 'Congratulations!  Your file was accepted.';
            }
        }
        //if there is an error...
        else
        {
            //set that to be the returned message
            $message = 'Oops!  Your upload triggered the following error:  '.$_FILES['photo']['error'];
        }
    }

Thanks in advance

user3150060
  • Beware of security. The name is not being sanitized properly; for example, I would be able to prefix the file with ../ and place the file (whichever I want) anywhere I have permission. You should take a look at your PHP settings and memory settings; webserver choice and setup are essential, along with hardware choice. – Ronni Skansing May 08 '14 at 17:34
  • Yes, I see your point. Thanks Ronni, I can sanitize it later. I need to solve the problem of multiple users sending me files at the same time; how can I handle that? – user3150060 May 08 '14 at 17:38
  • You need to perform stress tests to see how many users you can handle at a time, and plan how you handle peak values (either locally or via a 3rd party; there are lots of services for this). It might be that you can already handle the number of users you want (or not), but either way tests give you better metrics. You could also choose a hosting provider that auto-scales for you. – Ronni Skansing May 08 '14 at 17:39
  • Yes, I think you are right, I need to do a stress test, but how can I do that? Do you have some references for me? – user3150060 May 08 '14 at 17:43
  • 1
    http://stackoverflow.com/questions/7492/performing-a-stress-test-on-web-application?rq=1 or one of the associated links. Another choice would be some cloud hosting that scales. Lastly there alot of services that does the stress testing (3rd party) like http://loader.io/ – Ronni Skansing May 08 '14 at 17:44
  • Great, I am glad if it helped you a bit further. You can accept @TimeSheep's answer or add your own answer (in time) when you have solved this challenge, by any solution that makes sense in your question's context. – Ronni Skansing May 08 '14 at 17:55

1 Answer


You don't need to employ threading in your PHP script; that usually doesn't make sense for a website. The webserver takes care of distributing the load across system resources.

Every time someone accesses the API, a separate PHP process (or worker) handles the request, so several users can use it at the same time; the webserver takes care of the concurrency for you.

As mentioned in Ronni's comment, you might also want to spend a while securing the API against attacks that would effectively allow a user to overwrite one of your files. A good way to solve this issue is to simply give the file a random name instead of keeping the original one.
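As a sketch of the random-name idea (the function name and extension whitelist here are my own, not from the question's code; random_bytes() requires PHP 7+, on older versions you could substitute something like uniqid()):

```php
<?php
// Build a safe, random file name, keeping only a whitelisted extension.
// A random name avoids path traversal (e.g. "../evil.php"), avoids
// collisions between users, and prevents uploading executable scripts.
function safe_upload_name(string $original_name): ?string
{
    $allowed = ['jpg', 'jpeg', 'png', 'gif'];
    $ext = strtolower(pathinfo($original_name, PATHINFO_EXTENSION));
    if (!in_array($ext, $allowed, true)) {
        return null; // reject anything that is not a known image extension
    }
    // 16 random bytes -> 32 hex characters, plus the validated extension
    return bin2hex(random_bytes(16)) . '.' . $ext;
}
```

Inside the upload handler it would be used roughly like this: call `safe_upload_name($_FILES['photo']['name'])`, and only pass the result to `move_uploaded_file()` when it is not null.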


There are several techniques for optimizing your service. Some of the things you should pay attention to are the actual webserver (nginx, Apache, lighttpd, IIS, etc.) and the way the PHP files are handled (e.g. mod_php, php_fcgi, php-fpm).

Since you will be dealing with file transfers, you can expect a lot of network traffic and I/O, so you might want to look into servers on gigabit connections and, in extreme cases, solid state drives or large RAID configurations. Getting a VPS or using a cloud instance is usually a good idea because those services are easily scalable: getting new hardware on a VPS only takes a reboot, whereas a dedicated server may be down for 15-30 minutes while the hardware is swapped, or you might even have to move everything to another machine.

At the even more extreme end, you can look into purchasing IP load balancers to spread the load across a cluster, but this is probably not relevant for that number of simultaneous transfers.

If you have any specific questions with regards to server software or hardware, you can create a new question on https://serverfault.com/ which is the Stack Overflow equivalent for server management.

As for stress testing, check out some of the tools on this Wikipedia page.

Steen Schütt