
I'm using phpseclib to connect to an SFTP server, and I use $sftp->nlist to list directories. I can list all the small directories, but when listing one with more than 8000 files the memory limit is exceeded. I've already set ini_set('memory_limit', '128M') and ini_set('max_execution_time', '300'), but the response takes too long to be usable. Is there any way to retrieve this array little by little and free up the memory as I go? Sorry for my bad English.

ini_set("display_errors",true);
ini_set('memory_limit', '128M'); 
ini_set('max_execution_time', '300');
set_include_path("/var/www/cremesp.com/_class/API/UnnamedOpenSourceGroup/phpseclib/");
include_once("Net/SFTP.php");

$ftp_server = "*****";
$ftp_username = "*****";
$ftp_password = "*****";
$sftp = new Net_SFTP($ftp_server, *****);
$sftp->login($ftp_username, $ftp_password);

$files_proc = $sftp->nlist("/PROC/");

As I said, I can list other directories, but /PROC is the largest and takes about 150 seconds to respond.

  • show the code... – user1597430 Jun 08 '21 at 03:00
  • How long does the `$sftp->nlist` really take? – Martin Prikryl Jun 08 '21 at 04:56
  • What have you tried to resolve the problem? Is this maybe a networking issue between the systems? – Nico Haase Jun 08 '21 at 12:38
  • Are you sure it's the `nlist` that causes the problem? Isn't it rather some processing of the files later in your script? 8000 is not that much. I cannot imagine that it would cause problems with `nlist`, unless you have some ridiculously small memory limit. – Martin Prikryl Jun 08 '21 at 13:48
  • I'm pretty sure so, because as I said, when listing smaller directories, the feedback is instantaneous. In addition, I can list the "/PROC", the problem is the delay that is unfeasible in production – Rayk Rocha Jun 08 '21 at 15:58
  • A *"feedback"* of what? How exactly did you determine that the problem is in the `nlist`? – Martin Prikryl Jun 08 '21 at 16:50
  • not exactly in `nlist`, but in the way I implement it. I just wanted to know if there's any way I can get back to the directory list bit by bit – Rayk Rocha Jun 08 '21 at 18:02
  • What do you mean by *"the way you implement it"*? So either you want to ask how to retrieve the listing in parts. Or you want to ask about your memory problem, which seems to be related to some code you actually did not share with us. This more and more looks like an [XY problem](https://meta.stackexchange.com/q/66377/218578). We need [mcve]. – Martin Prikryl Jun 08 '21 at 19:19
  • https://github.com/phpseclib/phpseclib/pull/1418 would help with this but that has not yet been implemented. Maybe try posting in that ticket asking about it? Mind you, in reading that ticket, I do get the impression that that might not be included even in 3.0.x. Maybe in dev-master but dev-master is not supposed to have a stable API. eg. BC breaking changes could be made from one commit to the next. – neubert Jun 09 '21 at 04:34

1 Answer


Use pagination: scan the directory once and list the filenames from a given offset:

Refer: https://www.php.net/scandir

$limit = 1000;
$offset = 0;

$dir = scandir($path);

// print one "page" of $limit entries starting at $offset
for ($i = $offset; $i < $offset + $limit && isset($dir[$i]); $i++) {
    echo $dir[$i] . "<br />";
}

Use some pagination logic to increment the offset.
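
For the SFTP case in the question, a minimal sketch of the same idea, assuming the full nlist() result still fits in memory and using a hypothetical process_file() for whatever is done with each entry:

$limit = 1000;
$files = $sftp->nlist("/PROC/"); // one full listing; this is still the slow call

// walk the listing one "page" at a time and free each page when done
for ($offset = 0; $offset < count($files); $offset += $limit) {
    $chunk = array_slice($files, $offset, $limit);
    foreach ($chunk as $filename) {
        process_file($filename); // hypothetical per-file handler
    }
    unset($chunk); // release the page before moving to the next one
}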

or

you can remove the memory limit entirely (-1 means unlimited) along with the execution time limit:

ini_set("memory_limit", "-1");
set_time_limit(0);

Refer to this: PHP - read huge amount of files from directory (https://stackoverflow.com/questions/46489228/php-read-huge-amount-of-files-from-directory)

The linked question has an answer using RecursiveDirectoryIterator, which may be helpful.
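
For reference, a minimal sketch of that iterator approach; note that it only works on local directories (or a locally mounted copy of the remote tree, which is an assumption here), not over phpseclib's SFTP connection:

// iterate entries lazily instead of building the whole array up front
$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator("/path/to/dir", FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $fileinfo) {
    echo $fileinfo->getFilename() . "<br />";
}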

  • What takes time is the `scandir` (or its SFTP equivalent in OP's question). Limiting the number of printed entries won't have much effect on the overall execution time. – Martin Prikryl Jun 08 '21 at 04:55
  • I have updated the answer with a link that has an answer: https://stackoverflow.com/questions/46489228/php-read-huge-amount-of-files-from-directory – Ashok Kumar Thangaraj Jun 08 '21 at 12:35
  • I'm not sure it helps. You cannot implement paging using the iterator, without iterating all previous files. So (if the iterator is implemented efficiently), it might help when you use it to list first few "pages". But once you want to iterate the last pages, you are back at the original problem. – Martin Prikryl Jun 08 '21 at 13:19
  • Not to mention that the question is about SFTP, not local files. – Martin Prikryl Jun 08 '21 at 13:44