111

I have a small project made in Symfony2. When I try to build it on my server, it always fails while unzipping Symfony. The build used to work, then suddenly Composer stopped being able to unzip Symfony, and I didn't change anything. I tried building with Jenkins and also manually from bash, with the same result. It's not a permissions problem, and the internet connection on my server is fine.

Loading composer repositories with package information
Installing dependencies (including require-dev) from lock file
 - Installing symfony/symfony (v2.3.4)
 Downloading: 100%
[Symfony\Component\Process\Exception\ProcessTimedOutException]
The process "unzip '/path/vendor/symfony/symfony/6116f6f3d4125a757858954cb107e64b' -d 'vendor/composer/b2f33269' && chmod -R u+w 'vendor/composer/b2f33269'" exceeded the timeout of 300 seconds.
– zajca (edited by kenorb)

12 Answers

148

Check with `composer update -o -vvv` (or `composer install -o -vvv`) whether the package is being loaded from Composer's cache.

If yes, try clearing Composer's cache, or try adding `--cache-dir=/dev/null`.

To force downloading an archive instead of cloning sources, use the --prefer-dist option in combination with --no-dev.

Otherwise you could try raising composer's process timeout value:

export COMPOSER_PROCESS_TIMEOUT=600 # default is 300
– Nicolai Fröhlich (edited by hakre)
  • Well, the package is written into the cache; see the pastebin (it's too large for a comment): http://pastebin.com/sb7deyNc. Same result with the update command. Also, extending the timeout won't help, I think; it's a 4-core machine and it's almost idle. – zajca Sep 20 '13 at 13:44
  • 3
    Using `php composer.phar install --prefer-dist --no-dev` worked for me. – Rubens Mariuzzo Nov 04 '13 at 20:26
  • 3
    Works for me too. Does anyone know *why*? – hek2mgl Nov 12 '13 at 11:47
  • 2
    Well, after a while I figured out why this happened: the reason was slow NFS. I don't know why, since it's on the local network, but I switched to sshfs and it's working without problems. – zajca Feb 20 '14 at 09:12
  • 7
    A common problem is NFS shares being slow when it comes to heavy disk i/o ... i.e. cache folders are affected by this. You can work around this by moving cache folders to `/dev/shm/`. In the case of composer you could use `--cache-dir=/dev/shm/composer/cache`. Read more about it in **[this article](http://www.whitewashing.de/2013/08/19/speedup_symfony2_on_vagrant_boxes.html)** that targets performance tricks for symfony2 with vagrant nfs shares. Glad you solved your issue though. You might still consider accepting my answer as 11 upvotes + the comments clearly state it is helpful for others. – Nicolai Fröhlich Feb 20 '14 at 11:38
  • I found that I couldn't use a `cache-dir` option when running composer (1.0-dev), neither via `--cache-dir` nor `-cache-dir`. Instead, I had to add it into my `composer.json` file: `"config": { "cache-dir": "/dev/shm/composer/cache" }`. – Sam Sep 03 '15 at 09:32
  • 1
    `export COMPOSER_PROCESS_TIMEOUT=600` ( defaults to 300 ) – itsazzad Feb 12 '16 at 17:19
90
composer config --global process-timeout 2000
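For reference, this command persists the setting in Composer's global config file (typically `~/.composer/config.json`, or `$COMPOSER_HOME/config.json`; the exact path varies by version and platform), producing something like:

```json
{
    "config": {
        "process-timeout": 2000
    }
}
```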
– Ali Motameni
83

The easiest method is to add a config option to your composer.json file: set `"process-timeout": 0`. That's all; it works anywhere.

{
  .....
  "scripts": {
    "start": "php -S 0.0.0.0:8080 -t public public/index.php"
  },
  "config": {
    "process-timeout":0
  }
}
– X zheng
  • 14
    As of composer 1.9, you can also disable process timeout on a per script basis. eg. `"start": ["Composer\\Config::disableProcessTimeout","php -S 0.0.0.0:8080 -t public public/index.php"],` – ttk Sep 23 '19 at 16:04
  • 4
    This is what the answer from @Ali Motameni does for you, and what the comment under the answer does (by @morris4). They both actually change the corresponding composer.json file for you, altering this config value. One changes it in your global composer.json file, and the one from the comment changes it in the current project's composer.json file. – still_dreaming_1 Apr 21 '20 at 15:44
35

Composer itself imposes a limit on how long it will allow a remote git operation to run. A look at the Composer documentation confirms that the environment variable COMPOSER_PROCESS_TIMEOUT governs this. The variable defaults to 300 (seconds), which is apparently not enough for a large clone operation over a slow internet connection.

Raise this value using:

COMPOSER_PROCESS_TIMEOUT=2000 composer install
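A small shell sketch of how this one-off override behaves (`echo` stands in for `composer` here; the variable reaches the child process but does not persist in the parent shell, assuming it wasn't already exported):

```shell
# The variable is passed only to the child process started on this line...
COMPOSER_PROCESS_TIMEOUT=2000 sh -c 'echo "timeout=$COMPOSER_PROCESS_TIMEOUT"'
# ...and is not set in the current shell afterwards
echo "parent=${COMPOSER_PROCESS_TIMEOUT:-unset}"
```

Use `export COMPOSER_PROCESS_TIMEOUT=2000` instead if you want the setting to apply to every subsequent command in the session.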
– Tahir Yasin (edited by hakre)
7

It's an old thread, but in my case the reason for the timeout was a running PHP debugger (PhpStorm was listening for Xdebug connections), which caused the process to time out. When I closed PhpStorm or disabled the Xdebug extension, no timeout occurred.

– Hadi Sharghi
6

Old thread, but a new problem for me. None of the solutions here worked when trying to install google/apiclient (it failed on google/apiclient-services) on an Ubuntu VM inside a Windows 10 host.

After noticing Windows' "Antimalware Service Executable" consuming considerable CPU cycles during the composer install/update, I disabled "real-time protection" on the Windows 10 machine, and my composer update/install worked!

Hope that helps someone.

– Daydream Nation
5

Deleting composer cache worked for me.

rm -rf ~/.composer/cache/*
– wormhit
4

The Symfony Process component sets its process timeout to 60 seconds by default. That's why you get errors like this:

[Symfony\Component\Process\Exception\ProcessTimedOutException]     
The process "composer update" exceeded the timeout of 60 seconds. 

Solution

Set timeout to 5 minutes or more

$process = new Process("composer update");
$process->setTimeout(300); // 5 minutes
$process->run();
– Mahmoud Zalt
  • 1
    The snippet in the question says `exceeded the timeout of 300 seconds`. So it would either need to be higher than 300, or else the timeout isn't the problem (could be a caching issue, per @nifr and @wormhit's answers). – Sean the Bean Sep 21 '17 at 19:04
1

I agree with most of what has been suggested above, but I had the same issue, and what worked for me was deleting the vendor folder and re-running composer install.


– Ngugi Kiarie
0

None of the solutions worked for me on Windows 10 WSL Ubuntu (disabling the firewall, removing debuggers, clearing the cache, increasing the timeout, deleting vendor). The only thing that worked was deleting vendor and composer.lock from the main machine, copying composer.json to a fresh machine, installing PHP and Composer there, running composer install (it should take less than a second), then copying the vendor directory back to the original machine and running composer update.

– pmiguelpinto90
0

On Windows 11, and related to an answer above, adding a folder exclusion to real-time protection can stop the "Antimalware Service Executable" from scanning the folder and causing the timeout (and saves you from disabling "real-time protection" entirely).

– SunHunter
-2

The problem is slow NFS: Composer writes its cache into an NFS directory. You should install Composer globally and point the cache elsewhere.

This doesn't work:

php composer.phar install

Use this instead:

composer install

Before running this, you must configure Composer globally; see https://getcomposer.org/doc/00-intro.md#globally

Also, add these lines to your config.json:

"config": {
    "cache-dir": "/var/cache/composer"
}

Works for me.

– user3890355