
I need to back up a site over FTP. The site is hosted on a Linux server. The problem is that one folder contains more than 5k files, and the server won't list more than 4998 of them, so I can't copy the remaining files. I can't delete files to reveal the others because the site is live, and I can't move them to another directory for the same reason.

What can I do? I'm trying to use a shell, but I'm not sure about this method.

Tyler_
  • Is SSH an option? Can you copy all files to a second directory, then download and delete from there? Which OS runs on your client? – Reeno Oct 07 '15 at 13:48
  • SSH is not an option; it's shared web hosting and I can't get SSH access. I'm on Yosemite and I'm using Yummy FTP. I can copy etc., but only up to 5k files per folder. – Tyler_ Oct 07 '15 at 14:04

2 Answers


I found a solution to my own question:

<?php
$rootPath = realpath('wp-content/uploads/2014/07');

// Initialize archive object
$zip = new ZipArchive();
$zip->open('dio.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

// Create recursive directory iterator
/** @var SplFileInfo[] $files */
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($rootPath),
    RecursiveIteratorIterator::LEAVES_ONLY
);

foreach ($files as $name => $file)
{
    // Skip directories (they would be added automatically)
    if (!$file->isDir())
    {
        // Get real and relative path for current file
        $filePath = $file->getRealPath();
        $relativePath = substr($filePath, strlen($rootPath) + 1);

        // Add current file to archive
        $zip->addFile($filePath, $relativePath);
    }
}

// Zip archive will be created only after closing object
$zip->close();
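
Once a script like this has run on the server, the archive it writes can be fetched over HTTP. A minimal sketch, assuming the script is uploaded as zipit.php in the web root of example.com (both placeholder names):

# Run the script, then download the archive it created alongside it.
# "example.com" and "zipit.php" are placeholders for your own host and script name.
curl http://example.com/zipit.php
curl -O http://example.com/dio.zip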
Tyler_

You can do this through the command line; this guide shows you how. It seems mget (the FTP command) isn't recommended for recursive downloads (subfolders and their contents), so wget can be used instead; see this.
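
For example, a recursive wget download over FTP might look like the sketch below; the username, password, host and path are placeholders for your own details:

# Recursively download a remote FTP directory; credentials, host and path are placeholders.
wget -r ftp://ftpuser:ftppass@ftp.example.com/wp-content/uploads/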

I also like to zip folders with many files into a single archive, for easier handling when uploading and downloading. Use

zip -r myfiles.zip myfiles

Here's a guide for that too.

Eiriks