
I have access to a system that returns a string of data when I request a specific URL. There's (almost) always a reasonable response from the server.

Now, I've set up a wget in cron for automatic retrieval and parsing of this data. Sometimes the expected 4.5 kB of text is returned, but more often than not I get a file of size 0. If I then request the same URL in a browser, I get the 4.5 kB of text.

Why does wget return an empty file when I can get a fine response in my browser?

OZ1SEJ
  • Try finding out what status code is being returned, e.g. using this http://stackoverflow.com/questions/6136022/script-to-get-the-http-status-code-of-a-list-of-urls (for curl though) – Pekka Dec 18 '15 at 21:47
  • Network connections are fickle things, a browser hides a lot of this by trying multiple times to resolve a resource, wget by design only tries 20 times (unless you pass the argument `--tries=`) if the server does not respond at all. My hunch is you caught the server in a rare but not impossible `Connection Refused` moment (possibly some built in DDOS protection, or some load condition like an overloaded database) – Jason Sperske Dec 01 '16 at 19:46
  • Here is a detailed answer about wget that might help you http://superuser.com/a/689340 – Jason Sperske Dec 01 '16 at 19:47
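Following the suggestions in the comments, one way to narrow this down is to wrap the cron job so each run logs wget's exit status and the server's response headers. This is only a sketch: the URL, output file, and log path below are placeholders, not values from the original post, and the fetch is wrapped in a function so it can be called from cron.

```shell
#!/bin/sh
# Sketch of a diagnostic cron wrapper. URL/OUT/LOG are hypothetical
# placeholders; adjust them to the real job.

URL="http://example.com/data"   # placeholder for the real URL
OUT="/tmp/data.txt"             # placeholder output file
LOG="/tmp/wget-debug.log"       # placeholder debug log

# Map wget's documented exit codes to readable labels, so the log
# distinguishes network trouble from an actual HTTP error response.
describe_wget_exit() {
    case "$1" in
        0) echo "success" ;;
        4) echo "network failure" ;;
        5) echo "SSL verification failure" ;;
        8) echo "server issued an error response" ;;
        *) echo "other error (code $1)" ;;
    esac
}

# --tries/--waitretry retry transient failures (e.g. a momentary
# "Connection refused"); --server-response writes the HTTP status
# line and headers to stderr, which we append to the log.
fetch_once() {
    wget --tries=5 --waitretry=10 --timeout=30 \
         --server-response -O "$OUT" "$URL" 2>> "$LOG"
    status=$?
    echo "$(date '+%F %T') $(describe_wget_exit "$status") \
size=$(wc -c < "$OUT")" >> "$LOG"
}

# In cron, call fetch_once; after a few empty-file runs, the log
# shows whether the server replied at all and with what status.
```

After a day of cron runs, correlating the zero-byte files with the logged exit labels and response headers should show whether the server is refusing connections, timing out, or returning an error status.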

0 Answers