
I'm working on a project where I need to download images from an FTP server in order to put them on a website.

The paths to the images are listed in a .txt file, which is zipped on the server. The entry for each image looks like this:

pc33c1_26126_08_hd.jpg /photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg 1a71ffb90de7628b8b1585b7e85c68e7

The command to get each photo should look like this:

get /photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg pc33c1_26126_08_hd.jpg

This example was provided by the company that's hosting the images.
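Putting the two examples together, each listing line maps onto a `get` command as follows (a minimal PHP sketch; it assumes the three fields are tab-separated, which is what the scripts below also assume):

```php
<?php

// One listing line: local file name, remote path, and a hash.
// The separator is assumed to be a tab, matching the split logic
// in the scripts further down.
$line = "pc33c1_26126_08_hd.jpg\t"
      . "/photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg\t"
      . "1a71ffb90de7628b8b1585b7e85c68e7";

list($localName, $remotePath, $hash) = explode("\t", trim($line));

// The equivalent FTP client command:
echo "get $remotePath $localName\n";
// prints: get /photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg pc33c1_26126_08_hd.jpg
```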

When I connect to the FTP server from PowerShell, two directories are available: one containing the zip file, and another that appears to be empty. The paths in the .txt file show that the pictures are actually located inside that second, seemingly empty directory. With the get command I can download a picture, but only one at a time, and there are hundreds of pictures that need to be downloaded.

I've tried writing a PowerShell script that goes through the .txt file and extracts the URL needed to find each image. First I need to connect to the FTP server, but the following code doesn't work:

$FTPServer = "ftp.example.com"
$FTPUsername = "user"
$FTPPassword = "pass"
$credential = New-Object System.Net.NetworkCredential($FTPUsername, $FTPPassword)

$FTP = [System.Net.FtpWebRequest]::Create($FTPServer)

$FTP.Credentials = $credential
$FTP.UseBinary = $true
$FTP.KeepAlive = $false

The second piece of code is supposed to go through the .txt file once I've downloaded it from the FTP server, but I'm unable to connect to the server in the first place.

$f = Get-Content "photos.txt"
$i = 0

foreach ($line in $f) {

    $fields = $line.Split("`t")

    $First = $fields[0]
    $Second = $fields[1]
    $Third = $fields[2]
    $number = $i++

    $url = $Second + " " + $First

    Write-Host "Title is: $First"
    Write-Host "Path is: $Second"
    Write-Host "Hash is: $Third"
    Write-Host "Number is: $number"
    Write-Host "This is the url: $url"
    get $url
    Write-Host ""
}

This should go through the array and split each line to build the URL of the image together with its respective name. But without the connection, it does nothing.

Is it possible to run a script over the FTP session in PowerShell to download all those images at once?

I have also tried writing a PHP script, but I have encountered many problems. Here's the code:

<?php

$ftp_server = 'ftp.example.com';
$ftp_user_name = 'user';
$ftp_user_pass = 'pass';
$remoteFilePath = '/data/photos.txt.zip';
$localFilePath = $_SERVER['DOCUMENT_ROOT']."/path/";

// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

if ((!$conn_id) || (!$login_result)) {
    echo 'FTP connection has failed! Attempted to connect to '. $ftp_server. ' for user '.$ftp_user_name.'.';
}else{
    echo 'FTP connection was a success.<br>';
    $directory = ftp_nlist($conn_id,'');
    echo '<pre>'.print_r($directory,true).'</pre>';

    $contents = ftp_nlist($conn_id, "/data/");
    var_dump($contents);

    $bla = ftp_nlist($conn_id, "/photos/");
    var_dump($bla);

    ftp_pasv($conn_id, true);

    if (ftp_get($conn_id, $localFilePath.'/photos.txt.zip', $remoteFilePath, FTP_BINARY)) {

        echo "File has been downloaded!!";
        return true;

    } else {
        echo "fail ... ";
        echo "Connection has been stopped!";
        return false;
    }
}
ftp_close($conn_id);

?>

This connects to the FTP server, tests the connection, and downloads the zip file to the local computer.

<?php

// assuming photos.txt.zip is in the same directory as the executing script.
$file = 'photos.txt.zip';

// get the absolute path to $file
$path = pathinfo(realpath($file), PATHINFO_DIRNAME);

$zip = new ZipArchive;
$res = $zip->open($file);
if ($res === TRUE) {
    // extract it to the path we determined above
    $zip->extractTo($path);
    $zip->close();
    echo " $file extracted to $path";
} else {
    echo "Couldn't open $file";
}

?>

This one unzips the previously downloaded archive in the same location. The final piece of code, which attempts to take each path and download the images, doesn't work:

header("Content-Type: text/html; charset=UTF-8");    
$lines = file("photos.txt");
$dir = "local\\path\\to\\file\\";

foreach ($lines as $line) {
    $parts = explode("\t", $line);
    $something = $parts[1];
    $somethingelse = $parts[0];
    $var1 = $ftp_server.$something;

    file_put_contents($dir, file_get_contents($var1));
}

It successfully splits each line and builds the URL, but instead of downloading the images it gives me errors:

file_get_contents (...) failed to open stream: No such file or directory in...

file_get_contents (...) failed to open stream: Permission denied in...

I have tried changing all the permissions as proposed in some other topics, but to no avail.

I've also found a similar topic here, but the proposed solution was obsolete and not helpful.

I must say I'm pretty new to PowerShell and PHP, and I would appreciate any help that you can provide.

Xullien
  • Use cURL instead of file_get_contents and download each bit of the image and then dump it into a directory. Did a similar thing a few months ago but cURL is your answer ;) – RAZERZ Apr 09 '18 at 17:28
  • I tried to use the following code instead of the file_get_contents line: `$ch = curl_init($var1); $fp = fopen($dir, 'wb'); curl_setopt($ch, CURLOPT_FILE, $fp); curl_setopt($ch, CURLOPT_HEADER, 0); curl_exec($ch); curl_close($ch); fclose($fp);` There are no errors, but it doesn't do anything. – Xullien Apr 09 '18 at 17:37
  • What does the `$url` look like? – Martin Prikryl Apr 09 '18 at 18:28
  • When the $url prints out, it looks exactly like this: /photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg pc33c1_26126_08_hd.jpg – Xullien Apr 09 '18 at 19:51
  • But that's **not a URL**! So if you can `get` that path, I do not see what makes you believe you should use FTP to retrieve the file, if you can get it directly. – Martin Prikryl Apr 10 '18 at 05:21
  • How can I get it directly if I need to be logged in to the FTP? If I can run the first code after logging in FTP, shouldn't I be able to `get` all path ran by it? Or am I still missing what you are trying to tell me? @MartinPrikryl – Xullien Apr 10 '18 at 08:30
  • "If I can run the first code after logging in FTP, shouldn't I be able to get all path ran by it?" - That's vague and unclear. Sorry, but your question is incomprehensible. – Martin Prikryl Apr 10 '18 at 09:09
  • You don't understand my post. The company that is hosting the pictures gave me instructions to connect via PowerShell and use the get command with the path to retrieve the pictures. I could do that, but only one by one, which would take ages. The scripts that I wrote are giving me errors, and I asked for help with that. I believe the post was long enough, with enough of the steps I tried, to be clear, detailed and comprehensible. – Xullien Apr 10 '18 at 09:39
  • Yes, I do not understand your post, despite reading it at least three times. And I honestly believe it's not my fault. - For a start, you still didn't explain what FTP has to do with your question. + You describe that your first PowerShell script "downloads" files. But it does not **download** anything. It reads **local** files. And I could go on like that paragraph by paragraph. – Martin Prikryl Apr 10 '18 at 10:57

2 Answers


I ended up using PHP. Here is the working code:

1- Download the zip

$ftp_server = 'ftp.example.com';
$ftp_user_name = 'user';
$ftp_user_pass = 'password';
$remoteFilePath = '/data/photos.txt.zip';
$localFilePath = $_SERVER['DOCUMENT_ROOT']."/images";

// set up basic connection
$conn_id = ftp_connect($ftp_server);

// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);

if ((!$conn_id) || (!$login_result)) {
    echo 'FTP connection has failed! Attempted to connect to '.$ftp_server.' for user '.$ftp_user_name.'.';
} else {
    echo 'FTP connection was a success.<br>';
    $directory = ftp_nlist($conn_id, '');
    echo '<pre>'.print_r($directory, true).'</pre>';

    $contents = ftp_nlist($conn_id, "/data/");
    var_dump($contents);

    $photos = ftp_nlist($conn_id, "/photos/");
    var_dump($photos);

    // switch to passive mode before transferring
    ftp_pasv($conn_id, true);

    if (ftp_get($conn_id, $localFilePath.'/ddda-photos.txt.zip', $remoteFilePath, FTP_BINARY)) {
        echo "File has been downloaded!!";
        return true;
    } else {
        echo "fail ... ";
        echo "Connection has been stopped!";
        return false;
    }
}
ftp_close($conn_id);

2- Unzip the file

// assuming photos.txt.zip is in the same directory as the executing script.
$file = 'photos.txt.zip';

// get the absolute path to $file
$path = pathinfo(realpath($file), PATHINFO_DIRNAME);

$zip = new ZipArchive;
$res = $zip->open($file);
if ($res === TRUE) {
    // extract it to the path we determined above
    $zip->extractTo($path);
    $zip->close();
    echo "$file extracted to $path";
} else {
    echo "Couldn't open $file";
}

3- Read the text file and download images

// So the script doesn't time out
set_time_limit(0);

$ftp_server = 'ftp.example.com';
$ftp_user_name = 'user';
$ftp_user_pass = 'password';

$lines = file("photos.txt");
$dir = "F://test/";

// Goes through the file and separates name, path and hash
foreach ($lines as $line) {
    $parts = explode("\t", $line);
    $imagePath = $parts[1];
    $imageName = $parts[0];

    // Build an FTP URL that carries the credentials
    $ftpUrl = 'ftp://'.$ftp_user_name.':'.$ftp_user_pass.'@'.$ftp_server.'/'.$imagePath;

    // ftp url dump to be sure that something is happening
    var_dump($ftpUrl);

    $curl = curl_init();
    $fh   = fopen($dir.$imageName, 'wb'); // binary mode, the files are images
    curl_setopt($curl, CURLOPT_URL, $ftpUrl);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $result = curl_exec($curl);

    fwrite($fh, $result);
    fclose($fh);
    curl_close($curl);
}

The problem was that cURL had to pass the username and password with each request in order to download the pictures.
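For what it's worth, cURL can also take the login via the `CURLOPT_USERPWD` option instead of embedding it in the URL. A sketch of that variant (not tested against the real server; the server name, path, and credentials are placeholders):

```php
<?php

// Placeholder server, path, and credentials, for illustration only.
$ftp_server = 'ftp.example.com';
$imagePath  = '/photos/pvo/transfertvo/photos/pc33c1/pc33c1_x48_08_hd.jpg';

$curl = curl_init('ftp://'.$ftp_server.$imagePath);

// Pass the credentials separately instead of building them into the URL.
$ok = curl_setopt($curl, CURLOPT_USERPWD, 'user:password');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);

// $result = curl_exec($curl);  // would perform the actual transfer

curl_close($curl);
```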

Hope this helps someone with the same problem.

Xullien

You seem to expect that after "connecting" to an FTP server using FtpWebRequest, the get command would somehow magically fetch files from the server.

That assumes so many things that are simply not true that it's hard to know where to start explaining.

  • There's no `get` command in PowerShell. That's a common command in dedicated FTP clients.
  • FtpWebRequest cannot just "connect". The class is designed for one-off jobs, like downloading a single file. It does not create a persistent connection.
  • etc, etc, etc.

Basically, you need to download the list file using WebRequest (FtpWebRequest) and then repeat the same for each file:

$credentials = New-Object System.Net.NetworkCredential("username", "password") 

$baseUrl = "ftp://ftp.example.com"
$listPath = "/remote/path/list.txt"
$listUrl = $baseUrl + $listPath

Write-Host "Retrieving file list from $listPath..."

$listRequest = [Net.WebRequest]::Create($listUrl)
$listRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$listRequest.Credentials = $credentials

$lines = New-Object System.Collections.ArrayList

$listResponse = $listRequest.GetResponse()
$listStream = $listResponse.GetResponseStream()
$listReader = New-Object System.IO.StreamReader($listStream)
while (!$listReader.EndOfStream)
{
    $line = $listReader.ReadLine()
    $lines.Add($line) | Out-Null
}
$listReader.Dispose()
$listStream.Dispose()
$listResponse.Dispose()

foreach ($line in $lines)
{
    $fields = $line.Split("`t")
    $file = $fields[0]
    $remoteFilePath = $fields[1]

    Write-Host "Downloading $remoteFilePath..."
    $fileUrl = $baseUrl + $remoteFilePath
    $localFilePath = $file

    $downloadRequest = [Net.WebRequest]::Create($fileUrl)
    $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $downloadRequest.Credentials = $credentials

    $downloadResponse = $downloadRequest.GetResponse()
    $sourceStream = $downloadResponse.GetResponseStream()
    $targetStream = [System.IO.File]::Create($localFilePath)

    $sourceStream.CopyTo($targetStream)

    $targetStream.Dispose()
    $sourceStream.Dispose()
    $downloadResponse.Dispose()
}

The code is basically the same as my answer to PowerShell FTP download files and subfolders, except that it retrieves the list of files from a file instead of from a directory listing.

Martin Prikryl
  • Sorry for the delayed answer. Your code is functional for the .txt file on the server, but since the file is zipped, I ended up using PHP, which worked as well. I'll post it here as another solution. Thank you for your time. – Xullien Apr 11 '18 at 10:23