
I have an issue that I can't resolve. I have a script that copies small image files, and copying the variants of one image takes about 1.5 seconds. Is there any way to make it faster? I'm using PHP CLI, my HDD is a WD VelociRaptor 10K RPM, and the source folder contains about 200K files.

Here is the part of the code I want to speed up:

        $startCopyVariant = time();

        $result = array('uploadedImagesUrls'=>array(), 'errMsg'=>'');

        // lazyload class instances
        $productOptionImages = &lloader()->getImagesByName("productOption");

        // validate image
        $imgSizeInfo = @getimagesize($srcImageInfo['tmp_name']);
        if (empty($imgSizeInfo)) {
            $result['errMsg'] = 'Invalid image '.$srcImageInfo['name'].', type '.$srcImageInfo['type'];
            return $result;
        }

        $ext = pathinfo($srcImageInfo['name'], PATHINFO_EXTENSION);

        $variantFileName = 'opt_'.$optionId."_variant.".$ext;
        $mainDestFileName = $variantFileName;
        $srcFileName = $this->getCropSizeFileName($srcImageInfo['name'], "big");
        copy($srcFileName, $productOptionImages->getImagePath().$variantFileName);

        $variantFileName = 'opt_'.$optionId."_variant_sma.".$ext;
        $srcFileName = $this->getCropSizeFileName($srcImageInfo['name'], "small");
        copy($srcFileName, $productOptionImages->getImagePath().$variantFileName);

        $variantFileName = 'opt_'.$optionId."_variant_thu.".$ext;
        $srcFileName = $this->getCropSizeFileName($srcImageInfo['name'], "thumbnail");
        copy($srcFileName, $productOptionImages->getImagePath().$variantFileName);

        $variantFileName = 'opt_'.$optionId."_variant_tin.".$ext;
        $srcFileName = $this->getCropSizeFileName($srcImageInfo['name'], "tiny");
        copy($srcFileName, $productOptionImages->getImagePath().$variantFileName);

        $endCopyVariant = time();

        $elapsedTime = $endCopyVariant - $startCopyVariant;
        print_r("Variant copy time:  (".$srcImageInfo['name']."): ".sprintf('%02d:%02d:%02d', ($elapsedTime/3600), ($elapsedTime/60%60), $elapsedTime%60));

Thanks.

EDIT: Here is what getCropSizeFileName does:

    private function getCropSizeFileName($srcFileName, $size) {
        global $sourceCropBasePath;
        $ext = pathinfo($srcFileName, PATHINFO_EXTENSION);
        $destFileNamePrefix = basename($srcFileName, ".".$ext);
        return $sourceCropBasePath.$destFileNamePrefix."_".$size.".".$ext;
    }

The results of the timers for each copy line are:

Variant copy time1:  (0a46de43f73304469a38137bf3f43c32.jpg): 00:00:02
Variant copy time2:  (0a46de43f73304469a38137bf3f43c32.jpg): 00:00:01
Variant copy time3:  (0a46de43f73304469a38137bf3f43c32.jpg): 00:00:02
Variant copy time4:  (0a46de43f73304469a38137bf3f43c32.jpg): 00:00:01
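A side note on these measurements: `time()` only has one-second resolution, so each copy rounds to 0, 1 or 2 seconds. A minimal sketch using `microtime(true)` instead, which gives sub-second timings (the file paths and dummy file here are made-up examples, not the real variant paths):

```php
<?php
// time() rounds to whole seconds; microtime(true) returns a float
// with microsecond precision, giving a usable per-copy measurement.
$src = '/tmp/variant_timer_src.jpg'; // hypothetical test paths
$dst = '/tmp/variant_timer_dst.jpg';
file_put_contents($src, random_bytes(64 * 1024)); // 64 KB dummy file

$start = microtime(true);
copy($src, $dst);
$elapsed = microtime(true) - $start;

printf("Variant copy time: %.4f s\n", $elapsed);
```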
bksi
  • PHP is server-side; I think this speed is normal. You can use operating system commands in a PHP script. –  Aug 10 '13 at 06:33
  • But I have to copy about 1M images. It will take months at that rate. – bksi Aug 10 '13 at 06:51
  • Which lines are the expensive ones? Can you put timers after each line? – Homer6 Aug 10 '13 at 07:15
  • What does the "getCropSizeFileName" method do? – Homer6 Aug 10 '13 at 07:17
  • Every line with copy takes about 1 sec :(. This is awful :( – bksi Aug 10 '13 at 07:25
  • I noticed that the number of files in the destination folder affects copy performance. When the file count reaches about 1M, copying to that folder via PHP becomes very slow (about one file per second or two). – bksi Aug 10 '13 at 22:31
  • @bksi See my answer; it is consistent with the behaviour you're experiencing. It's not the destination folder but rather the aggregate file size of what you're copying. – texelate Oct 27 '15 at 15:18

3 Answers


I found out today that you can run into this on a shared server if the hosting company limits the I/O rate. I went through this with my hosting company on Linux, and the limit was 1 MB/s (to protect other accounts on the server). So if you have 10 MB of files to copy, you add 10 seconds to the script's execution time at a 1 MB/s rate.

There's nothing inherently wrong with PHP; I found that using exec and the cp command made no real difference to the speed.
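A quick way to check this yourself is to time PHP's `copy()` against shelling out to `cp` on the same file; in my case both were equally rate-limited. A rough sketch (the 1 MB test file and paths are invented for illustration):

```php
<?php
// Compare copy() with exec('cp ...') on the same file as a sanity
// check that PHP itself is not the bottleneck. Paths are hypothetical.
$src = '/tmp/iobench_src.bin';
$dst = '/tmp/iobench_dst.bin';
file_put_contents($src, str_repeat('x', 1024 * 1024)); // 1 MB test file

$t = microtime(true);
copy($src, $dst);
printf("copy():  %.4f s\n", microtime(true) - $t);

$t = microtime(true);
exec('cp ' . escapeshellarg($src) . ' ' . escapeshellarg($dst));
printf("exec cp: %.4f s\n", microtime(true) - $t);
```

If both numbers track the provider's advertised limit, the cap is at the I/O layer, not in PHP.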


I should add that if you need to routinely copy large files, you should upgrade from a shared hosting package.

texelate
  • It is possible, but in my case I used our own servers without limits. Also, I experienced this issue on both Linux and Windows platforms, so it is not platform-dependent. – bksi Oct 27 '15 at 15:59
  • Thanks, it's still worth noting though as this limit can produce the same problem. – texelate Oct 28 '15 at 07:12
  • Yes, this could be one reason for this behavior. I'm open to other suggestions too. In my case the problem was the large number of files in the target folder (> 1M). – bksi Oct 30 '15 at 19:56

Try profiling your script using something like xdebug. That way you can tell exactly what is causing any bottlenecks in your script.
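As a minimal sketch (the setting names assume Xdebug 3; Xdebug 2 uses different ones), the profiler can be switched on for a single CLI run:

```shell
# Enable Xdebug's profiler for one CLI run; cachegrind output goes to /tmp.
php -d xdebug.mode=profile -d xdebug.output_dir=/tmp your_copy_script.php
# Inspect the resulting cachegrind.out.* file with KCachegrind/QCachegrind.
```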

Mike

After many attempts I noticed that when the number of files in the destination folder reaches 1M, copy performance falls dramatically. Maybe it is something to do with NTFS. I disabled the Windows Search service, so that is not the reason. I also tried it on a Linux server with an SSD, with the same result. It seems the sheer number of files in one folder influences copy performance.
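One common workaround for this is to shard the destination directory by the first characters of the file name, so no single directory accumulates millions of entries. A sketch of the idea, assuming md5-style names like the ones in the question (the helper name and base path are hypothetical, not part of the original script):

```php
<?php
// Spread files across <base>/xx/yy/ subdirectories keyed on the first
// four characters of the file name, creating shard dirs on demand.
function shardedPath(string $baseDir, string $fileName): string {
    $shard = substr($fileName, 0, 2) . '/' . substr($fileName, 2, 2);
    $dir = rtrim($baseDir, '/') . '/' . $shard;
    if (!is_dir($dir)) {
        mkdir($dir, 0775, true); // recursive mkdir for the shard dirs
    }
    return $dir . '/' . $fileName;
}

// 0a46de43... lands in <base>/0a/46/ instead of one flat folder.
echo shardedPath('/tmp/variants', '0a46de43f73304469a38137bf3f43c32.jpg');
```

With two hex characters per level, 1M files spread into 65,536 directories of roughly 15 files each, which keeps per-directory lookups cheap on both NTFS and ext filesystems.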

bksi