For security reasons (I'm a developer) I do not have command-line access to our production servers where log files are written. I can, however, access those log files over HTTP. Is there a utility in the manner of "tail -f" that can "follow" a plain text file using only HTTP?
5 Answers
You can do this if the HTTP server accepts requests to return parts of a resource. For example, if an HTTP request contains the header:
Range: bytes=-500
the response will contain the last 500 bytes of the resource. You can fetch that and then parse it into lines, etc. I don't know of any ready-made clients which will do this for you - I'd write a script to do the job.
You can use Hurl to experiment with headers (from publicly available resources).
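For example, assuming a hypothetical log URL, you can experiment from the command line with curl (the server must support byte-range requests):

```shell
# Request only the last 500 bytes of the resource via a Range header.
# The URL is a placeholder for your real log file.
curl -s -H 'Range: bytes=-500' "https://example.com/logs/app.log"

# curl can also build the header for you with -r (same suffix-range syntax):
curl -s -r -500 "https://example.com/logs/app.log"
```

If the server honors the range, it replies with 206 Partial Content; if it ignores it, you get the whole file back, so check the status code before relying on this.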

+1 This is very helpful. I had thought of using a HEAD request to get the size of the resource and see whether anything new had been added. Coupled with a GET on a specific range, I might be on to something. Thanks! – kurosch Nov 02 '09 at 18:08
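That HEAD-plus-Range idea can be sketched as a small polling loop (the URL and the 2-second interval are assumptions; the server must support HEAD and byte-range requests):

```shell
#!/usr/bin/env bash
# Poll a remote log over HTTP: use HEAD to learn the current size,
# then GET only the bytes we have not yet seen via a Range request.
URL="https://example.com/logs/app.log"   # placeholder
offset=0
while true; do
  # Content-Length from a HEAD request gives the current resource size.
  size=$(curl -sI "$URL" | tr -d '\r' | awk 'tolower($1)=="content-length:" {print $2}')
  if [ -n "$size" ] && [ "$size" -gt "$offset" ]; then
    # Fetch just the new bytes and print them.
    curl -s -r "$offset-$((size - 1))" "$URL"
    offset=$size
  fi
  sleep 2
done
```

Note this assumes the log only ever grows; if it is rotated, the size shrinks and the loop would need to reset its offset.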
I wrote a bash script for the same purpose. You can find it here: https://github.com/maksim07/url-tail

You can use PsExec to execute commands on the remote computer. A tail command for Windows can be found at http://tailforwin32.sourceforge.net/
If it has to be HTTP, you can write a lightweight web service to achieve that easily, e.g. one that reads the text of a specified file from line 0 to line 200.
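The file-slicing part of such a service is a one-liner; a sketch of the extraction it would perform (app.log is a hypothetical file name):

```shell
# Print lines 1 through 200 of the log; the trailing 200q makes sed
# quit at line 200 instead of scanning the rest of a large file.
sed -n '1,200p;200q' app.log
```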

I wouldn't be able to execute anything remotely; that would make the security people apoplectic. I was hoping something already existed before I tried rolling my own. – kurosch Oct 29 '09 at 21:50
You can use a small Java utility to read the log file over HTTP with the Apache HttpClient library:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.HttpClientBuilder;

HttpClient client = HttpClientBuilder.create().build();
HttpGet request = new HttpGet(uri);
HttpResponse response = client.execute(request);
BufferedReader rd = new BufferedReader(new InputStreamReader(
        response.getEntity().getContent()));
String line;
while ((line = rd.readLine()) != null) {
    // Process the line
}
rd.close();

I wrote a simple bash script to fetch the URL content every 2 seconds, compare it with the local file output.txt, and append the diff to that same file. I wanted to stream AWS Amplify logs in my Jenkins pipeline.
while true; do comm -13 --output-delimiter="" <(cat output.txt) <(curl -s "$URL") >> output.txt; sleep 2; done
Don't forget to create the empty file output.txt first:
: > output.txt
View the stream:
tail -f output.txt
UPDATE:
I found a better solution using wget:
while true; do wget -ca -o /dev/null -O output.txt "$URL"; sleep 2; done
