3

I'm trying to write long-running code, but it times out after a set number of seconds, and I want to avoid this using any workaround.

The workflow:
The user sends an AJAX request at the press of a button. This initiates a process (function) which, for example, polls multiple websites for info or sends POST data using cURL. Ideally, it should provide some info once in a while, but it would be even better if it could run in the background.

The no-no's:
The following functions cannot be used in the code: set_time_limit, exec, fork, and anything pcntl-related.

Possible solution:
I searched through many posts, and one possible workaround would be to split the code into multiple parts (e.g. send one cURL request at a time) and have jQuery reinitiate the connection until a given condition is met.
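A minimal sketch of that chunked approach (the endpoint, URL list, and `fetch_one` helper are all hypothetical): each AJAX request processes exactly one item and reports whether more remain, so every single request finishes well inside the time limit.

```php
<?php
// chunk.php -- hypothetical endpoint sketch: each AJAX request processes
// exactly ONE item; progress is kept in the session between requests.
session_start();

$urls = ['https://example.com/a', 'https://example.com/b']; // hypothetical job list

// Placeholder for the real work; in practice this would be a single
// cURL call with CURLOPT_TIMEOUT set safely below the PHP time limit.
function fetch_one(string $url): string {
    return "fetched: $url";
}

$i = $_SESSION['next'] ?? 0;
if ($i < count($urls)) {
    $_SESSION['results'][$i] = fetch_one($urls[$i]);
    $_SESSION['next'] = $i + 1;
}

// The jQuery side keeps re-posting to this endpoint until done === true.
echo json_encode(['done' => ($_SESSION['next'] ?? 0) >= count($urls), 'index' => $i]);
```

On the client, the jQuery success callback simply re-issues the same `$.post` while `done` is false, which is the "reinitiate the connection" loop described above.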

But is there a way to avoid the timeout on the server side? I also tried using the Process component of Symfony, Ratchet, sockets...
Many thanks!

Edit: Fixed formatting. I forgot to mention that the code has to be reusable on any server, so editing any config files is not an option either.

  • Increase the value for timeout in php.ini file. – Ropali Munshi May 03 '19 at 10:09
  • I have to agree that the only really reliable way to avoid a time-out is to make it longer, or to [time-out the cURL calls](https://stackoverflow.com/questions/2582057/setting-curls-timeout-in-php) earlier. – KIKO Software May 03 '19 at 10:14
  • Well you can use cron, the thing is that if the script is executed from command line it has no timeout. So by POSTing your data you can 'schedule' it and cron job will do what you need – Zeusarm May 03 '19 at 10:14
  • @Zeusarm How do you spawn a CLI process using AJAX and POST? – jaszfalvi.tamas May 03 '19 at 10:16
  • Your limitations clearly imply that you can't actually achieve what you want. You can divide the entire task into many smaller tasks but each of those tasks would need to run in under the timeout time therefore you can't poll, but rather do a single request for info. These tasks can then be called through a series of AJAX requests instead of one. If you can run a background process you can also use a queue (like e.g. https://github.com/javibravo/simpleue) to queue your tasks and have them run in the background and store their results in a database which the client can request status for. – apokryfos May 03 '19 at 10:23
  • @apokryfos Thanks! Will check it out. – jaszfalvi.tamas May 03 '19 at 10:28
  • Typically big tasks are chunked down into jobs, using a queue (Amazon sqs, rabbitMQ, etc). So the client sends the API call, immediately gets a response and a bunch of jobs get shoved into a queue. A worker picks the jobs up one by one (or in parallel) and does the work this way. – Harry May 03 '19 at 10:34
  • @jaszfalvi.tamas you are storing the information which is supposed to be processed in a table, and later the CLI is "taking" this data from the table and processes it. something like a queue. I'm using it in order to index user uploaded documents in elasticsearch – Zeusarm May 03 '19 at 11:16
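The queue-plus-cron pattern described in the comments can be sketched as follows (the `jobs.sqlite` file, table layout, and job payloads are all hypothetical): the web endpoint only INSERTs a row and returns immediately; a CLI worker launched by cron does the slow work, so no web-request timeout applies.

```php
<?php
// worker.php -- hypothetical CLI worker, run from cron. The AJAX endpoint
// only inserts a 'pending' row; this script does the long-running work.
$db = new PDO('sqlite:' . __DIR__ . '/jobs.sqlite');
$db->exec("CREATE TABLE IF NOT EXISTS jobs (
    id INTEGER PRIMARY KEY,
    url TEXT,
    status TEXT DEFAULT 'pending',
    result TEXT
)");

// Claim pending jobs one at a time; the browser can poll a separate
// status endpoint that SELECTs from this table to show progress.
while ($row = $db->query("SELECT id, url FROM jobs WHERE status = 'pending' LIMIT 1")
               ->fetch(PDO::FETCH_ASSOC)) {
    // ... the slow cURL work for $row['url'] would go here ...
    $upd = $db->prepare("UPDATE jobs SET status = 'done', result = ? WHERE id = ?");
    $upd->execute(['processed', $row['id']]);
}
```

A crontab entry such as `* * * * * php /path/to/worker.php` would run the worker every minute; since CLI scripts have no execution time limit by default, each run can take as long as it needs.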

1 Answer

-4

Use set_time_limit(0) at the start of your script.

– Akshay Naik