
I have two commands in a cron job like this:

mysql -xxxxxx -pyyyyyyyyyyv -hlocalhost -e "call MyFunction1";wget -N http://mywebsite.net/path/AfterMyFunction1.php

but it seems to me that both of them are running at the same time.

How can I make the first command run and when it completes, execute the second command?

Also, AfterMyFunction1.php makes JavaScript HTTP requests that are not executed when I use wget. It works if I open AfterMyFunction1.php in my web browser.

medo ampir
  • wget is not able to parse and execute the embedded JavaScript in the HTML pages you retrieve; wget doesn't render the page, it just retrieves the contents. – Niels Castle Mar 10 '12 at 18:35

2 Answers


If the first command is required to be completed first, you should separate them with the && operator as you would in the shell. If the first fails, the second will not run.
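A minimal sketch of the difference, using `true` and `false` as stand-ins for a succeeding and a failing first command (the actual commands in the cron entry would be the OP's mysql and wget calls):

```shell
# ';' runs the second command unconditionally, as soon as the first finishes.
false ; echo "runs regardless of the first command's exit status"

# '&&' runs the second command only if the first exits with status 0.
true  && echo "runs because the first command succeeded"
false && echo "never printed: the first command failed"
```

Applied to the OP's crontab entry, the `;` between the mysql call and the wget simply becomes `&&`.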

vincent
  • It seems easier and more maintainable to make a script that runs multiple commands consecutively and to schedule that script, versus having a big list of commands separated by &&s. – JustinP Apr 04 '14 at 16:44
  • If you have only two commands, I don't see what the problem is or how such a script would help; you would also have to take care of that file (permissions, storage...), which takes more time and is less efficient. – Dimitri Kopriwa Jul 22 '16 at 04:58
    I don't think this answer is correct - the semicolon will run commands sequentially, not in parallel. So I don't think anything is wrong with the OP's command line. – nonagon Oct 05 '17 at 17:43
  • @BigDong People searching for how to do the same thing with a lot more commands might find this question, so JustinP's comment is wise. – dstonek May 03 '18 at 16:45

You could use sem which is part of GNU parallel.

0 0 * * * root  sem --jobs 1 --id MyQueue mysql -xxxxxx -pyyyyyyyyyyv -hlocalhost -e "call MyFunction1"
1 0 * * * root  sem --jobs 1 --id MyQueue wget -N http://mywebsite.net/path/AfterMyFunction1.php

This cron config will first start the mysql through sem, which will put it in a kind of queue called MyQueue. This queue will probably be empty, so the mysql is executed immediately. A minute later, the cron will start another sem which will put the wget in the same queue. With --jobs 1, sem is instructed to execute only one job at a time in that particular queue. As soon as the mysql has finished, the second sem will run the wget command. sem has plenty of options to control the queueing behaviour. For example, if you add --semaphoretimeout -60, a waiting job will simply die after 60 seconds.

The && solution is probably better, since it won't execute the second command when the first one fails. The sem solution has the advantage that you can specify different cron settings, like a different user. And it will prevent overlapping cron jobs, if the cron interval is shorter than the job duration.
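The two approaches can also be combined in a single cron entry; a sketch, reusing the OP's commands, with the compound command quoted so sem queues it as one job:

```shell
# One cron entry: sem serialises runs under the MyQueue id, and '&&'
# makes the wget run only after (and only if) the mysql call succeeds.
0 0 * * * root  sem --jobs 1 --id MyQueue 'mysql -xxxxxx -pyyyyyyyyyyv -hlocalhost -e "call MyFunction1" && wget -N http://mywebsite.net/path/AfterMyFunction1.php'
```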

Onnonymous
  • `sem` won't return the `rc` of the utility (mysql, or wget in this case). As a result, `cron` won't send an email when the utility fails. – Vladimir Botka Jul 05 '20 at 04:39