
I am writing a debug/admin node server that allows users to execute a long-running process on the machine. I want to stream the output of the child process to the form they began the action from.

I can do this with sockets, but I have to have the client subscribe to a channel, and I have to post messages to the whole channel when they only have to do with the one client.

I'd prefer to be able to stream the http body down to the client. I can do this fairly easily with node: just keep writing to the request's socket, call end when I'm done.

Is there any way to use XMLHttpRequest to call a web service, have it fire events whenever new data is available, and a final event when it closes? Possible with jQuery?

Note that this isn't really the same use case as normal real-time updates, for which sockets are a good choice. This is a single request. I just want to get the response in pieces.

Sean Clark Hess
    sounds like [long polling](http://stackoverflow.com/questions/333664/simple-long-polling-example-code) – Marshall Mar 13 '12 at 17:42
  • I know it sounds like it, but I don't want to poll. I want to keep a reference to the single request socket around and keep pumping data to it. Long polling requires some other way of keeping the data around (not just the scope of the request) – Sean Clark Hess Mar 13 '12 at 17:52
  • Although you don't want to use sockets, it seems like they would be best for this. You can have a client connect via [socket.io](http://socket.io/) and then send the data to the single (individual) socket until finished. – Marshall Mar 13 '12 at 18:07
  • You know what? You're right. I should have the client send a "request" through the socket connection, instead of over http, and just send messages back to the one guy. – Sean Clark Hess Mar 13 '12 at 19:16

3 Answers


What I was hoping for isn't possible: you can't make an XHR request, keep it open, and parse it a chunk at a time.

Here is a summary of people's suggestions:

  1. Use socket.io anyway, and change your architecture to support pushing events.
  2. Use socket.io, but make requests through it, as if you were hitting urls. Make a little url router on the server side of socket.io and stream stuff down all you want.
  3. Keep the initial html page open and parse it as you go (not feasible for my implementation)
  4. (3), but in a hidden iframe.

I went with 2.

Sean Clark Hess

As an update to this question: nowadays you can use Server-Sent Events (SSE). That way you don't need to do anything particularly special on the server side, or set up websockets, which are overkill when you don't need full duplex. XHR also keeps the entire response in memory, which is non-ideal for large files. I had the same question, and I answered it here:

How to process streaming HTTP GET data?

Wilhelm

Some years ago (before AJAX appeared) I used "javascript" streaming over an open HTTP response.

The idea here: write a chunk of

<script type="text/javascript">do js stuff here</script>

for each step of the process you want the client to react to.

It may still work.

dweeves
  • Writing the info to the page isn't hard. It's getting chunks of it over time from the server. – Sean Clark Hess Mar 13 '12 at 17:53
  • i meant writing the `<script>` tags – dweeves Mar 13 '12 at 18:02
  • Cool. The thing I don't get is: How can you set up a single "never ending" client request? – Sean Clark Hess Mar 13 '12 at 19:17
  • Only by not closing the response server-side until the action being reported via script-tag streaming ends (which may last as long as needed). – dweeves Mar 14 '12 at 09:40
  • oooohhh, you're saying to keep the main HTML page open, not an xhr. – Sean Clark Hess Mar 15 '12 at 04:47
  • It may work using a hidden iframe to handle the "open request"; the streamed js tag code should then use window.parent.xxxx calls to update the main page. – dweeves Mar 15 '12 at 09:37