
I want to run a .php file every 10 minutes with a cron job on Ubuntu. This is my crontab:

*/10 * * * * php -f  /var/www/html/gapi/src/test2.php >/dev/null 2>&1

And this is in the log file:

CRON[9994]: (root) CMD (php -f  /var/www/html/gapi/src/test2.php >/dev/null 2>&1)

This PHP file makes an API call, and I can see the API calls live on the dashboard of the API provider, so I know the script is not running every 10 minutes.

I set the file permissions to 755. What else can I do to make it work?

Updated Crontab:

*/10 * * * * php -f  /var/www/html/gapi/src/test2.php
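A debugging-oriented variant of the entry above (a sketch; the interpreter path `/usr/bin/php` and the log location are assumptions, confirm the real path with `which php`):

```shell
# Use the absolute path to php (cron's PATH is much shorter than an
# interactive shell's) and append output to a log file instead of
# discarding it, so any error message survives.
*/10 * * * * /usr/bin/php -f /var/www/html/gapi/src/test2.php >> /tmp/test2-cron.log 2>&1
```

After one interval, `cat /tmp/test2-cron.log` should show either the script's output or the error that is stopping it.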
Vaze
    What exactly do you mean by `In this php is an api call`? It sounds like you are monitoring any access via a webserver, but are calling the script from the command line. Change your request to use wget to request it via your web server if that's the case. – Eborbob Sep 04 '15 at 08:55
    Start by not dumping potential error output to `/dev/null` and you might see why it doesn't work. – deceze Sep 04 '15 at 08:57
  • @Eborbob this sounds like it is the reason, can you please create an answer with more details? I am using a Google API to get some data, if I open the file with my browser it works fine. – Vaze Sep 04 '15 at 09:04
  • @deceze I edited it. – Vaze Sep 04 '15 at 09:05
  • So, any error messages that show up? – deceze Sep 04 '15 at 09:06
  • Try using the full path to PHP, like `/usr/bin/php5-cli`. Also when using the global crontab, you need to add the user between the intervals and the command itself. – Daniel W. Sep 04 '15 at 09:08
  • @deceze no, "Sep 4 05:10:01 Eywow CRON[12142]: (root) CMD (php -f /var/www/html/gapi/src/test2.php)" – Vaze Sep 04 '15 at 09:11
  • @DanFromGermany Like this? */10 * * * * /usr/bin/php5-cli root -f /var/www/html/gapi/src/test2.php – Vaze Sep 04 '15 at 09:12
  • don't you put your php in `public_html`? i meant, if you have `public_html` directory just put your phps there. it would be easier to call. – Oki Erie Rinaldi Sep 04 '15 at 09:14
  • and make sure your cron has enough **privileges** to access the target file! – Oki Erie Rinaldi Sep 04 '15 at 09:16
  • @OkiErieRinaldi `html` folder = `public_html`, it is just another name. – Vaze Sep 04 '15 at 09:21
  • Try this: http://stackoverflow.com/questions/2135478/how-to-simulate-the-environment-cron-executes-a-script-with. There's some difference in the environment between your regular CLI usage and cron's usage. Simulate cron's environment and execute the script exactly as cron would to clearly see what the issue is. – deceze Sep 04 '15 at 09:28
  • @Vaze the `/home/user/public_html` folder is like a mirror of `/var/www`; they are not the same directory. But accessing `http://localhost/` is accessing `/var/www`, and `/home/user/public_html` = accessing both of them (if you enable `public_html` in your server config). – Oki Erie Rinaldi Sep 05 '15 at 01:38
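The cron-environment comparison deceze links above can be sketched like this (the variable values here are examples of cron's typically sparse defaults, not your system's actual values; a temporary `* * * * * env > /tmp/cron-env.txt` line captures the real ones):

```shell
# Re-run the script under an environment as sparse as cron's, instead of
# your interactive shell's. A PATH difference is a classic reason a
# command works in the terminal but fails under cron.
env -i HOME=/root LOGNAME=root PATH=/usr/bin:/bin SHELL=/bin/sh \
    /bin/sh -c 'php -f /var/www/html/gapi/src/test2.php'
```

If `php` is not found here, cron cannot find it either; using the binary's absolute path in the crontab fixes that.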

1 Answer


Try requesting the file through your web server rather than calling the script via the command line PHP interpreter.

*/10 * * * * wget -q -O /dev/null http://localhost/gapi/src/test2.php

(`-q` suppresses output; `-O /dev/null` redirects the downloaded file to /dev/null so it isn't saved)

Or, using `curl` instead:

*/10 * * * * curl --silent http://localhost/gapi/src/test2.php

The URL will depend on how your server is set up - you say it works through your browser at the moment so just use the same URL in the cron file.
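If you stick with the HTTP approach, one refinement worth considering (a sketch using standard curl options) is to let a failing request surface instead of vanishing: `--fail` turns HTTP 4xx/5xx responses into a non-zero exit status, which cron reports (typically by local mail) rather than silently ignoring:

```shell
# --silent hides the progress meter, --show-error still prints genuine
# errors, --fail makes HTTP 4xx/5xx responses exit non-zero so cron
# notices, and --output /dev/null discards the page body itself.
*/10 * * * * curl --silent --show-error --fail --output /dev/null http://localhost/gapi/src/test2.php
```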

Eborbob
  • `curl` would do this better – Oki Erie Rinaldi Sep 04 '15 at 09:17
  • @OkiErieRinaldi Why is curl better? How would it look like? – Vaze Sep 04 '15 at 09:19
  • FWIW, "-q" will silence wget and stop it outputting headers etc. I'd avoid just dumping the output to /dev/null - the script should be silent on normal operation - so at least someone will get an email when it misbehaves ... – David Goodwin Sep 04 '15 at 09:22
  • @OkiErieRinaldi I've added an example using `curl`. Take your pick. – Eborbob Sep 04 '15 at 09:23
    There's no need, and sometimes no possibility, to involve a web server when a simple CLI invocation should do. It's an insane workaround. – deceze Sep 04 '15 at 09:26
  • @Eborbob Thank you so much, it is working, I used the wget method. – Vaze Sep 04 '15 at 09:31
  • @deceze Calling through the web server makes it more portable - if you use a FQDN instead of localhost in the cron then you can change the server the script is on and update the DNS and nothing breaks. The asker is also using the reporting functions in a web server to monitor the requests. In the grand scheme of things 1 request every 10 mins isn't going to make any difference, but for what it's worth I'd do a simple CLI invocation in this situation. – Eborbob Sep 04 '15 at 09:35
  • "More portable"?! I'd argue the opposite. It requires a web server to be set up. Especially if used with a FQDN it also requires DNS and firewall setup. All this just for running a local script...!? No, it probably won't make any difference in server load, that's not what I'd be worried about at all. – deceze Sep 04 '15 at 09:42
  • @Vaze @Eborbob because `wget` saves the webpage to your local directory, but `curl` *only* accesses the webpage. Adding the `-q` parameter to wget only hides its output; it does not prevent it from saving the webpage. – Oki Erie Rinaldi Sep 07 '15 at 08:10
  • @Vaze curl can download files too, but with some extra options. http://daniel.haxx.se/docs/curl-vs-wget.html – Oki Erie Rinaldi Sep 07 '15 at 08:24