
I use a service to generate sitemaps, and I'm trying to automate the retrieval process.

I have been using wget to fetch the data and add it to my server.

Here is my wget statement:

wget --no-check-certificate --quiet \
  --output-document sitemap.xml \
  --method POST \
  --timeout=0 \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --body-data 'method=download_sitemap&api_key=[SECRET_KEY]&site_id=[SECRET_ID]&sitemap_id=sitemap.xml' \
   'https://pro-sitemaps.com/api/'

This command works great for me, no issues.

I ran crontab -e and added the command to my crontab using nano, which looks like this:

25 0 * * * "/etc/letsencrypt"/acme.sh --cron --home "/etc/letsencrypt" > /dev/null
07 18 * * * wget --no-check-certificate --quiet \  --output-document "/FILE/PATH/sitemap.xml" \  --method POST \  --timeout=0 \  --header 'Content-Type: application/x-www-form-urlencoded' \ --body-data 'method=download_sitemap&api_key=[SECRET_KEY]&site_id=[SECRET]&sitemap_id=sitemap.xml' \ 'https://pro-sitemaps.com/api/'

My problem is that the command is not running from cron. I have set things up so my server time matches my local time. I have tried running the wget statement all on one line and removing any extra spacing in the entry. I tried shorthand options (-T instead of --timeout), and I have tried adding a space at the end of each cron job. I am a bit stumped. It's probably something really simple that I missed in the documentation. Does anybody have any suggestions or notice anything off with what I'm doing in my crontab?

I have looked at these two questions, which is where I have gotten my troubleshooting ideas so far: How to get CRON to call in the correct PATHs, and this question: CronJob not running.

Again, I have no issues when I run the wget statement in my terminal; it pulls everything just as expected. My issue is only that when I put the wget command in my crontab, the command won't run.

1 Answer


\ denotes line continuation; you should not use it if you have the whole command on one line. For example,

wget --continue \
  https://www.example.com

after conversion to one line is

wget --continue https://www.example.com
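
Applied to your crontab entry, the whole command would sit on a single line with the backslashes removed, something like this (a sketch that keeps the placeholders and file path exactly as posted in your question):

07 18 * * * wget --no-check-certificate --quiet --output-document "/FILE/PATH/sitemap.xml" --method POST --timeout=0 --header 'Content-Type: application/x-www-form-urlencoded' --body-data 'method=download_sitemap&api_key=[SECRET_KEY]&site_id=[SECRET]&sitemap_id=sitemap.xml' 'https://pro-sitemaps.com/api/'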

Regarding cron: if you have a working command, you might put it in a file and run it through bash. For example, you might create fetcher.sh with content as follows:

wget -P /path/to/catalog https://www.example.com

inside /path/to/catalog, and then add this to your crontab:

58 23 * * * bash /path/to/catalog/fetcher.sh

where /path/to/catalog is the path to an existing directory. It would then download the example domain into /path/to/catalog every day at two minutes to midnight (23:58).
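
For your case, fetcher.sh could look like this (a sketch, reusing the placeholders and output path from your question; inside a script file the backslash continuations are fine again):

#!/bin/bash
# Fetch the generated sitemap from the API and write it to the server
wget --no-check-certificate --quiet \
  --output-document "/FILE/PATH/sitemap.xml" \
  --method POST \
  --timeout=0 \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --body-data 'method=download_sitemap&api_key=[SECRET_KEY]&site_id=[SECRET]&sitemap_id=sitemap.xml' \
  'https://pro-sitemaps.com/api/'

and the crontab entry would then be:

07 18 * * * bash /path/to/catalog/fetcher.sh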

Daweo