
I am currently experiencing a strange problem: I have created a large script which doesn't seem to run to completion every time.

To elaborate: the script executes a lot of database queries and API calls to update data on my website. It has been running for several weeks now, but both before that and at the time of writing, the script will sometimes stop running if it is started multiple times (usually twice) in a row.

So the problem is not very consistent, but the script is a vital part of my website and I need it to run once every day, at a specific time, to keep everything running.

The script itself shouldn't be the problem. At first we had to fix an issue where it would time out; timeouts are a big problem because the script can take up to an hour to finish.

What I've checked:

- Are any cronjobs interfering with my cronjob? * Not as far as I can see. A lot of standard maintenance crons are running, but they shouldn't cause any problems. The other cronjobs we created ourselves do not run while the problematic cronjob does.

- Do I have enough memory left? * Almost too much.

- Does the script work as it should; could it be the cause of the problem? * It could be, but I don't see how. I've checked the script manually a million times.

- Did you turn on the logs? * I didn't. This might be a good idea, but I don't know which log(s) to turn on. Also, we've had error-reporting problems in the past, so I'm afraid it won't catch anything if I turn it on.
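On the logging point: one way to get logs without depending on the site's error-reporting configuration is to enable logging inside the cron script itself and register a shutdown handler that records the last error before the process dies. This is only a sketch; the log path is an assumption, so adjust it to a file the cron user can write to.

```php
<?php
// Log everything this script does, independent of the site's php.ini settings.
error_reporting(E_ALL);
ini_set('log_errors', '1');
ini_set('display_errors', '0');
ini_set('error_log', '/tmp/sc_distri_cronjob.log'); // assumed path; pick any writable file

// Record fatal errors (which ordinary error handlers never see) just before exit,
// plus a timestamp so silent deaths leave at least one trace in the log.
register_shutdown_function(function () {
    $err = error_get_last();
    $fatal = [E_ERROR, E_PARSE, E_CORE_ERROR, E_COMPILE_ERROR];
    if ($err !== null && in_array($err['type'], $fatal, true)) {
        error_log(sprintf('FATAL: %s in %s:%d', $err['message'], $err['file'], $err['line']));
    }
    error_log('script exiting at ' . date('c'));
});
```

If the process is killed from outside (for example by the kernel's OOM killer), even the shutdown handler won't run, but the absence of the "script exiting" line then tells you the process was killed rather than finished.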

What I'm running:

- Turnkey Linux

Finally, the script is hard to debug: it takes about an hour to finish, and as far as I know it never stops at the same place in the code.

tl;dr: How should I debug a cronjob that seems to crash randomly for a while and then runs fine for long periods of time?

Edit: Thank you for the swift responses. @Basile Starynkevitch: We have tried that, but it doesn't throw any errors; it just stops. There is nothing in the script that could cause that.

I can't show you the relevant parts of the script, I think; it's much too large. But what it all comes down to is a lot of queries and data processing.

@fedorqui: cd /var/www/cronjobs/ && /usr/bin/php -q -f sc_distri_cronjob.php. It is in a public folder for testing purposes, but I don't think that has anything to do with my problem. Since the script runs successfully for long periods of time, I guess the permissions are fine?

@Flosculus: We have a lock in place, and this is what caught our attention. It runs "fine" once; then we run it again and it tells us the file is still locked, even though the first run has stopped. The lock is released at the end, and the script contains no errors; nothing in it could purposefully stop it from running.
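For what it's worth, a lock that stays held after the script has died is exactly the symptom of a lock file that is created and removed manually: if the process is killed mid-run (by the OOM killer, a fatal error, a signal), the cleanup at the end never executes. An OS-level flock() lock avoids this, because the kernel releases it automatically when the process exits, however it exits. A minimal sketch, assuming a hypothetical lock path and helper name:

```php
<?php
// Acquire an exclusive, non-blocking lock; returns the open handle on success,
// or false if another instance already holds the lock. The kernel releases the
// lock when the process exits, even if it is killed, so a crashed run can
// never leave the lock stuck.
function acquire_lock(string $path)
{
    $fp = fopen($path, 'c'); // create if missing, never truncate
    if ($fp === false) {
        return false;
    }
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // another run is still active
    }
    return $fp; // keep the handle open for the lifetime of the script
}

// Temp dir is used here so the sketch runs anywhere; a fixed path such as
// /var/run/sc_distri_cronjob.lock would be more typical in production.
$lock = acquire_lock(sys_get_temp_dir() . '/sc_distri_cronjob.lock');
if ($lock === false) {
    error_log('previous run still active, exiting');
    exit(1);
}
// ... long-running work here; no explicit unlock needed ...
```

The design point is that the lock's lifetime is tied to the process, not to a cleanup statement that may never be reached.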

Nick

1 Answer


Since your script is very large, it may run longer than your php.ini's max_execution_time.
Try calling set_time_limit() with a large enough value.
You might also track how long your script runs on average and adjust the call accordingly.
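A minimal sketch of this suggestion follows; the two-hour limit is an assumed value. Note that the PHP CLI SAPI defaults max_execution_time to 0 (unlimited), so this mainly matters if the job is triggered through a web SAPI or the default has been overridden.

```php
<?php
// Allow up to two hours; the counter restarts from the point of this call.
// set_time_limit(0) would remove the limit entirely.
set_time_limit(7200);

// Log the total runtime so the limit can be tuned to the observed average.
$start = microtime(true);
register_shutdown_function(function () use ($start) {
    error_log(sprintf('runtime: %.1f s', microtime(true) - $start));
});
```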

peipst9lker