
I have a website built with CakePHP, and I'm trying to change the download system from Apache to FTP.

I'm currently doing the FTP calls by hand in PHP (not using any plugins or libraries).

My code successfully connects to the FTP server. The problem is that when I call ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY), it executes successfully, yet the file is not downloaded; instead it is moved to the /webroot/ folder, which serves, as the name suggests, as the root folder of the website (via .htaccess rules).

I mention the .htaccess because I suspect it may be what is causing the FTP transfer to be routed ("moved") to the web root, but I'm not sure.

The following is my FTP download code:

    $conn_id = ftp_connect($host);
    $login_result = ftp_login($conn_id, $ftp_user, $ftp_pass);

    if ($login_result)
    {
        try
        {
            if (ftp_get($conn_id, $localFile, $remoteFile, FTP_BINARY))
            {
                $this->log("Downloaded...", "debug");
            }
            else
            {
                $this->log("Couldn't download...", "debug");
            }
        }
        catch (Exception $e)
        {
            $this->log("Error: " . $e->getMessage(), "debug");
        }
    }
    else
    {
        $this->log("Couldn't log in to the FTP server...", "debug");
    }

    ftp_close($conn_id);

Yes, I checked (printed out) both $conn_id and $login_result, and they are valid.

What is inside the paths?

    $remoteFile = "/downloads/rdticoiroe1432584529/YourMusic.zip";
    $localFile = "Music.zip";

The code does not throw any errors. I also tried the fotografde/cakephp-ftp plugin for CakePHP, and it behaves the same way...

EDIT 1:

It's a music download site. Right now we serve file downloads with Apache (which is very slow), and the idea is to move to FTP downloads. I'm trying to use the FTP protocol to deliver the file when the client requests its download.

EDIT 2:

The whole idea of this question is that I'm trying to move to a more optimized method of serving files to clients. I own a 100Mbit transfer server and downloads are pretty slow. At the moment I'm using Apache to deliver files to the clients who request them.

I completely misunderstood the idea of using PHP's FTP functions to push files to the client's hard drive.

So I'm looking for some guidance on what methods/protocols people use to serve files to clients who request a download (this is a music download site).

nullwriter
  • .htaccess cannot affect FTP, since FTP won't be using HTTP (no web server, no .htaccess) to do the transfers. – Marc B May 25 '15 at 20:28
  • `$localFile` is being set to a path relative to your script. Use a full `/path/to/target/location` – CrayonViolent May 25 '15 at 20:28
  • How would I get the full path of where the file should be downloaded on the client's side? – nullwriter May 25 '15 at 20:29
  • `ftp_get` gets the file from the remote server and puts it on *your* server (the server the script is running on), not the client's computer. Since you are specifying a relative path in `$localFile`, it's putting it relative to your running script on your server. So, if you want it put somewhere else (on your server), you need to specify an absolute path, or use e.g. `../` to move up a dir, etc. – CrayonViolent May 25 '15 at 20:31
  • @CrayonViolent The idea is to serve files via FTP to clients, instead of the slow Apache. I read ftp_get is what you use for this...? – nullwriter May 25 '15 at 20:32
  • no. If you are basically wanting to be a middleman (proxy) then [use curl](http://stackoverflow.com/questions/5830504/ftp-download-file-from-server-directly-into-client) or [a socket](http://stackoverflow.com/questions/1397182/stream-ftp-download-to-output) – CrayonViolent May 25 '15 at 20:34
  • Ooohh... wow... I've been under the wrong impression =/... Which would you say is the faster method, cURL or a socket? – nullwriter May 25 '15 at 20:37
  • you may possibly be able to do what you have now and use ["php://output"](http://php.net/manual/en/wrappers.php.php) as the `$localFile` value but.. I've never tried that – CrayonViolent May 25 '15 at 20:38
  • What makes you think you can serve a file to the client faster than Apache can? – IMSoP May 25 '15 at 20:39
  • I don't know which solution is best (I'm not an expert) but my money is on sockets. – CrayonViolent May 25 '15 at 20:40
  • @IMSoP I've been advised by my webhost that serving files via Apache is a much slower method – nullwriter May 25 '15 at 20:47
  • @ChristianFeo Much slower than what? What question did you ask them? Because I think they may have misunderstood. – IMSoP May 25 '15 at 20:50
  • I agree w/ @IMSoP . I bet your webhost meant something like "..as opposed to giving users ftp access to your site" – CrayonViolent May 25 '15 at 20:51
  • @CrayonViolent the users' only access is a pretty "Download" button for requesting the download. That's where it will work from. I'm looking for a way to optimize the method of serving files; giving FTP access surely is not an option... – nullwriter May 25 '15 at 20:56
  • "I own a 100Mbit transfer server" - 100Mbits from where to where? Shared with what other infrastructure? Connected to what upstream providers / backbones? Located how close in geography and network topology to the clients downloading the music? All of these things are likely to have much more impact than the software used to serve the data. – IMSoP May 25 '15 at 21:10
  • You're right... do you think software optimization has little to do with this? For example, mega.nz gives us amazing download speeds (and they're located in New Zealand, we are in Venezuela)... and our current server is in Texas, and speeds are... super slow (it could also be our shared VPS, but speaking location-wise). – nullwriter May 25 '15 at 21:22
  • Yeah, all the software has to do is push a load of bytes (the file contents) onto a wire; there's really not much to be optimised. Once it's on that wire, though, it's got to make its way through a whole maze of connections, many of which are congested with loads of other traffic, before it reaches the client's PC. BTW, don't assume the location of a server based on a domain name - large sites will store copies of data on servers all around the world, and automatically connect users to whichever server is closest to them, with no indication in the URL at all. – IMSoP May 25 '15 at 21:36

1 Answer


You have a fundamental misunderstanding here: the "remote file" in an FTP connection is something remote from you - a file on another server somewhere, for instance. In your case, I think the file you're trying to serve is "local" in this sense, but you want a way of getting it to a client quicker.

There is no way of pushing a file out to a client's hard drive (think of the security implications), you can only make it available. Right now, the client can request it using HTTP, and Apache will give them the content. If you installed an FTP server, they could connect with an FTP client and request it that way. Similarly, you could install a different web server, like nginx or lighttpd, which might be faster.

Nothing you do in PHP, however, will be faster than Apache alone, because Apache is still involved; now your PHP code sits on top of it as well, which at best adds the time needed to run your script before doing what Apache was going to do anyway.
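
To make that concrete, a typical PHP "download script" boils down to roughly the sketch below (with a hypothetical $path pointing at the file on the web server's disk). The request still goes through Apache, which then has to start PHP, which then reads the same file off disk, so it can only add overhead compared with Apache serving the static file directly:

    // Hypothetical passthrough download; $path is a placeholder for the file's
    // location on the web server's own disk.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="Music.zip"');
    header('Content-Length: ' . filesize($path));
    readfile($path); // Apache + PHP together do what Apache alone could do
    exit;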

IMSoP
  • Thanks for the clarification. I'm trying to optimize the download speeds; right now, even with a 100Mbit transfer server, things do not work well and download speeds are slow, and my web host has advised moving away from Apache downloads. What technology/method should I be studying to implement? – nullwriter May 25 '15 at 20:53
  • To be honest, the bottleneck is likely to be your hosting provider, so they're probably trying to cover their backs with dubious advice. Apache may not be the fastest HTTP server at serving files (look up lighttpd, as I mentioned) but the difference is likely to be dwarfed by low bandwidth in the various connections between there and the client's computer. – IMSoP May 25 '15 at 21:01
  • Well, at least I know where to look now, though I'm still a bit dubious about the whole deal... I had this idea that FTP would solve my problem. It's been hard to find guidance around these topics. I'll have a look at lighttpd and mark this as the answer later today... thanks man! – nullwriter May 25 '15 at 21:08